var/home/core/zuul-output/
var/home/core/zuul-output/logs/
var/home/core/zuul-output/logs/kubelet.log.gz
N`2P `j+$M!eȤ Fɒrs,onC}+yz{:WnZmͺ~h 1YdL[1>r@x9)ь蓘2"Ȑ lr;,{@(ZfAA޳,Aa4}I.~v^,qf#9%~I9YbĔ}HPu*E)l:R6eɧgGo {ڭM'O?QT.F7狯We|1#?y/?͓eޑͨܭۻ$;›X ML-Ml $D2)J*6F:_uHcLKuFN!c.ӵ w  n79f ێ_Z؆nSY[Nr~zGzI现M9(tbwZ$EXb%85BJ-xX?u㸆 6ݰd~&glu$ed2gR8es$"SgFJٻ6$W?uSvW~3 6hQvodYR>XŪ/"3РuYuZ7!RFY16Nu`qIEJW:raN_nnc"1MP )K$)P5"̓1Yn82L}k` LFuҌَ2kS"m] !^l36=q6l>I{MlҽkBm\!K ^{1n~9mv>]40ﭣ{._v>W^:s>dכd}p+y.3;БKOݖuQOϋ`Ly(k|Z5\-SR,En56?Ql.!.,pdq  Jˉ:w.,%5 rprWY`;㮲t]DWwWYJVޢEBqWY\E⮲Jɨ +P,kQt~V%ow~*w{ ׆!& J;mQJMpm':#ԆGfnS m w{=X4ǻՕY&c58j Mw7qdWo뿦~0 ٚGd٘[K만/?$|-JSmEx #њ ^۵9}+//o* _ǧO.ߡt'3唫MY`C;3ţvfr~S|RԈ-NB((`qWY\`]qWYZyIR|[tWJ.X t]eqYgU]e) +쐻kw ҙ,-;{tYtW5כl/'~?ӏ?͔Zaǒޡ[MVD(~'03hx-%_ߧuԋ-ʹf"!TS"ZqACЉQ*"S?DXQy J=s x/# y 9a\hʲ˜R=h(VS~yG!þk*jڀC0hCB>UToS+Zh&8FIAR~AAm $`<2cDGA .-RH! &8 8*͓᠔HH^E`))#1E5Zݵ,+Xj+1u! יX eWbVV,%DHOHCsݯz?5 $"zzՠYm8<>WTㇱ)#91f2-\vcqtyLNxu4:l?rO-|hTuҝޠ7CLOI#60"aǙtND=IċrD h(7y#8ŸZNѾ0wH1qhyj(--ș.mi)0SUbW72apv NU60)yH9&p'knuŀAou(Ǜ=I5F2m av0x! gbT3aW;NA4>S (H4p^;[ZzQ&iD&jJVYrSUhfZ\(8St"G N,Ns14}ren6YmNW4_nϗM0ntrND4! <&刹ewR\@D`4d@3C}dZ4$Τ mQ娀i@H"N҅9<>wH$<כmi}//|VyњكK&Fp1p]MxGl$Nwէs#n'v.ojNW5]c8GYenݜ-%8WY7ΖrލqگɈ6_5-NޞKb.;Kt~NAQAtt׵24AK"јB^p{~_E݌_G>6BH࡙n%H kNBckͬu sۣm<~r ]AhTې6p(W nx(©ZE~v Pq qv>l7E܉87OIXC8ϛM)GHi.õ,DM4N8ӤLh9u~mU4h% PRj"ȝ "tJT@A!T nV<j+Hm>'@!y^~|m3ެ|~.)ٰ-+;˭nBKm\֎o:|)| "Hnb<&U4ɐ/JUNJjڟLQPhdMBI{ 1Z+ssU6zgFb/sy&XF'5D *R| "8GKhe7zCErY#TcFxI.R-U1%hT[؂߂7W`ǣpT->}b;n)@sG\Q#E+U>l]1x2>mtmV^ڛh"A6`ltA,F`!d !ky! g"&( QZNYR0P@4EI@Z x5 `E(C5iJ#AT:/9åY^xw2GtbŻ~"*} ` s SUR+)XޙVRY\FJ*K [Ie)Em%&[IEœpn< 5onTTiNσSO´:Ey>AGpV72hbDMGk~yk *OSd"~k70IJ֛i|#ACb۠KUu4_JY ^e͆dԥƫF6*fFZ~٦StWep5VRʘ`ZBڜe*my-Zշ#߃. `kXZmWg uc=*d+ōRJKPIǃo&&P=kI"^6-Jo="/{w7yiˑk4/f9C7̙٦LƦ@KSMvc.އo7mlɺ7ikTSM論yZB.>wH0db JnNXvf˔fԽ]yԴ#pR8s6oǍuT4k!cB%yy1tJOPj-x+MY)i /gbﳯ}@HdfifaU' Ll/ q=TKJa0u6#\qEaPe:`a6;?gzWy KݘRLg fU۴*#`"DtYjhBF[+>RԦT9L~-ivz=ݡ_UO~*|~i S:gkjW>WsK|nEj_n'%&ۛf0~XדBk{b|uO!ka,A} K3f1AϫɆ1J^AuX)z8] 9߇#\ VV5 ^̿r3S6K/__۫Kh兟thKɢf{Jh83W㞛6J&0T[5īAn>|?{w?1Q^^ ˤ#I^/ZmZ#s o/F9߿/ ؼ_w]ŕ΅Cjb16HǪͳl%sc(顲Ѥ k1j0 .0mSg޽ z#(wR֏@:kP^! 
vcd͜cNQoI$1x1 ӑ<ڄ"uNݣԚ(ExBrU-GhEqѩ`!Z9x=0BL [S0W[.%uRՐNՐAvdjȳv5u8ZA'6`&q0 BCǨ> |)L^W &uDhQhcϛ6cC{RUSi/q\]A~J)kخOGgVv֑,-Nʟ~}ۇǗ?ŵlsJx40|RP8S2xgNu쉢詖Ʋ萰RaZheH;B:_IBk V"4z7!B*Caj3jX$"﵌FMFSf̧qiy\Pd0ݕ5qUa"7wK^խ.>o n xDpQf얦p7MۨV! t6]D[kyJb;,iyhA&$^c5ba4b.d 쌺VIFE^[Uod!qFpGj4GTTCO 4ӷRMQm_,x)z 2 @]9=h'0vDae99vlhD=@b%_0,`,Bp] !;9ttqBDSQ4 ru6'8\M'~$;y'8qE+ Xc}6*K9Q+ĩDFzSE xS.{ol\6g)y6}%S8 D 40-i`ُ9F t8>rd/w9Ctn=ASwS)0$!%|ߦݵhO#SEDUF'U,r3(߭hN?ZY^+f-g>I󰂺Iȭ*lKw\q LONT޹Ŗ2%H]'5ɹ+VWW.Y+ju/x6*KƺJԊWWJ%:u Օ+qF &̥zj/u=t2*Q;gY+ET5xZwж{w?~rʜk˫I;.L?03皁H^rÜQ*"[XcI"mCiX_+jbÓfbIk#1G}:wDŽQ3Yp^6yO{M"iQ2HtX}09'o\J a \ ^ШJXԋ^Ջ)%Vi!c1@ѠG8bL8b(Xj;nw՘t^SE>^Z20_TB6.,bZM}vz{䥽thˉRgA-!е+849R%cF1 )@,9SG7>p-}61mήnZos^l #P-{S,#,|>SUB[bC;uOFI5JqyNW=`Uv+rfVD+"p&ɧSBN/L,ާ*8TE$NeKgG|NP<PtE7ːԩwE.(?8 7 *ނ\n.rh4*#wi-9JdJORC/oxYBBx`C,rj*\p, cr)oeV+ĉ;fJF(c| N9ťoW12#XP1g&ʭ]3nR qƎp^  *Mnөj-un^z _'O]c3C,&x90c%)%#nIƝWR(? JڤB:`^ew) Ӂi]I;\jmYkNkwlUKc*8`aA`Ӂ{ ҘgTHNeL+ : $H|łKs6&}Cքo:I$CJYK"cYM9+]ґRH*Yi3YJjvYJRZ{!ay'Zz#k\q3~{~)@0/K"X`T9F$㑅H>Ʃ$]2<)c"ҀS}4t odg)/m]>7'\ycӭ¼&BP564UTsc%bD<8aNxQ>$g'AObl.Z߹0Gqaewa e}^ i/aeD>8B ^0Y5a]MmD8eMtW/ 'z1 $ZOK2\Q8XE!pE0AD4ňUYE n9^ru e YkqZsq\x9_JpPCƆְ/$8B` b4D-%`sifb0p3" U7ch և Q"! #K9 Q 徵B݆Yr\Ym&PbaII%E|Bsx=t4- SfX>BQ,}9L[kDl4<0IM!Ҷa5vΠOxSFYĦqd ǹUO~fd\ PfՉ|p8q҃oѧb&QQ匍(\E-E*JN;ĕuGi99ā.5fr Noq8 ^W]j5Eu :Qfz9a9>ލ3/7>읎GTpEAjsq\L :P I'l`T@ HRs&s4+Jweqc*h%{gqoR0#4'(xAc 7rF+H!ݷ`li>ۭv`_Hŷ` o>/̽^k_1npJ]~hǰC5-ԅsc̱b6,;kH)EВ)`ЄK*v쎜'ռƾӰ|N]^D.!g1T)b+t@Hm LLj9+0Rp4bz^PtR@u c^J dsqr5˶cPDPS;K襔Nҳ/dXP123[>oS_~.T{0IU(YA ϱ4.d4p3 PāXL9>&"9V%~0礣!ߥ4SڭUʿlQٵB+9_MSRxl!GH2RL s"N;A==Zma7kịUh4BkM̨3h``) NPB2=Kpo85\=bGių*%Tæb[;#L q3VkQF^|8r `t˵.0#f["geTle=ZY@x!:w3 G+<^y`FS,SH6Ydd:ƔÔK>#ap)LIX8FҎYR  tn'‚%ȳsoilp\>]r/'J`N<|&mfCT魛Ovwݖx-W?;r5qV^IAX`&F aG اZ( `9$0=)*oMvMwD@4:&(fE `aU' L= *C09'V}ja݇])IV m ™en?ִ 1K'頩qn5QXM0J#$GϿZ`~Ő\?x`#_Z$?5\#rfɇ* "0 Iw%a( y|0>vO*Sy߬RɣT7qpo*EҬoM&zvQ_0PY:r{!7M Z(=4{ l!a~*aʜփyr3¯_'Lvx.tsx7mCsTNkԮR,b2tVFK]? 
y+<'!:+inqn)y_ҥzm'\ ]R In~ۯ `&ӻ:kX\&효bTp?)߀o|ջ/*槯Տ1Q?{ջH`$E>}e_CxTCkDu{QS[|B`.W4)uz1[1H籎*&1Jz4)o@u :s( Yuy4H)ʝ/)oLoQg+7F(9DG`hT\j"eޑVq3o Ec-{N ʁb d݂:ނC^Ϊo< ;ϊܣ3wF`M:^m94V@:rR0+~|IQA)3ZX+VLLI<*#Ϳu%zW.b+L[arMӵ}h4˛mFZ'BZ>)7NIӗRPW;鸬[`ouZC8vaHٻ7'9otlꙢ"HF.d"HF.d"HF.d"HF.q2WDTxf+[ciWp~=}܂6J!^ؠLaL )&t풅,fgֿ}+(Ӕ:H`eQ5b0(kϩW`''&g}^̪[)arX ЕlFS.wnssDHcu6?2l0Jaa,,8S0Yql O/Ź^xDzrQrL,d AQ)9@Zh ұWtH9DŽRn&x9iWґaYqGk\bD {#g7: ]hWۂ " zL˻|; غ 1O1r}dQU'4uPYXMDÃ%Iz w^ ؙTFuJcL7v l- `R eq 59]ӱOAeKHr@K uDt=,:k#p4<((Wky.~O@4"jxTqPHvYI(Ji`+ H_p=߰v)ɳRWS)໧1̴b?% Y"'UX@0R:ʬV [wQ("7P=)Q[M*Ffd j0DPR}k٭8㙺~ua_օ҅+1᳇|ac͋_7(h40׮!Tٻ6r$W |)awp{Y,ڎ+d߯ؒlٱ8lVw]b !mKHJﴑH1QD&! f\ ?:%UMmZHS!("5َi<+ޠv+^ԖQ[=0MOEk尭'. 0 P]*pSךŵRJ]*^׺JP ֵBߺ74Zw'E[2sco5` :tBZ2t1TF)1aP;bOkGd[U%e-cY|ޕVdK R$7;g8-(X9ݍՠɚO{Ճ98E./BJk%E($˿SJI xTsRD$@[@d2lHZYH.IQAFl]8ۍu-M/X+0j+ Z;^hKѧ5eGeMw(,!vgl  0RrD$(SP]*vV / f>pgR>vYa+kQaIJ-tZ165u+אs-'ptAbx{fvSuy](,(˺Q'ŁlŬV* ,&lvlf0&ƻ`o$6}Td z9`o nbU)Sܿl}4,fx@d}?M74=|޴mKT4/phK]˞e9=z5vu†53m`htĴFuj1ԩtPR[]@#l4۶jӋcـrrB=dMz߿b7 I"ry{S]DHfZxcjֻe{޽fwtWO {x\\cH 꽥mƻcJ2~tx`GN[2YW!Z}}/zCEMoInkuǘ^R ,ْBZl(O)@)!I:{@,j_t4ӍCoٖ6=V!RS{8g>=&_NwpvehedYDg|o;o<*Ir! YC՞Vk_0Қ:(DI8&h h]0ڛb6PhJbC*dk'$4Pz Ȥ1 LN(m0JP4&l_ǫ%[>w7{.Pko>7u<|{5qFU3!xmvV&9va~t|=Aꌱ`M.K23RW]p:A SJGΣuFi8v대L.3M>=zae]N@NjPH!6~FGo\]+g4:uu"🩏xtt<=gO^b4F4Wb\L{lsYһkQS>Ai>wVdž~wG~߇m78k[>`g>z2?Ͽ|4kpFGaa׋RM7N~%pSE>!K ? |j෼!pܶ |1>#i|=|{0} Ѧ^k۲"5Eyᓗ(':e) @RspEe,)$ lIf7Ao?gJ6’y\!bqd"nH \OOҲQd/PDv6{6&6gnS(Mjߟ8ɱWO3nV.zl/00,6%X'gWGkoTtk|}Pj14EYu:ʨ5PNI> $1*RPC ^xnֻsII\T =%$#e_ DGJ V7fXhD[,\o]v|7kO)o;2Μ~A//ǟ/ϟs#6L9u*i#3Ñb",5D0Ёk[6ϧŬ;TMmZHS!("eZxW˽AV-P{`4*yA h[9WLId,AI$6k#cb73J`/q59)bd82Fr1Y ;b}َSlwXq_~|/"ƈDq?u m/!/7&uRgTI6z KV:M=iNG'kyֽZVڧXg3-^\ԍqQ8^CA/](nt62uƗbau3j ۳ 2#2<&OIb|u{NEFG;F4=n3$u$tL(c"TAUʖ``3>\A6s飨v̥R#K \ ]'zoto}4@u M 5`X4Z/hvggguGQ5"EI ʽ#g"! c(fxB$F,1BPƹIB M*$ <#:jFrs 1%Am٘E ]hbfku![y}nKr\^_t pl=?E| X V^ʛ?-k|$G MFjy8)_4 |\6,;)BfMR*@dD7A1I7BΈTihyqI"ɢsc]%1Lߒ4! 
+)lOJ.T7A3\+{Vqח-r%:h3zY9v3[T'TǗfIurQ|$*Dgew}Bk>Nnoikҧ7(} aSMGz]?U{ (Gc !$HPL762;W$Y(ךY ++ߣGOy)CU|wDNR6!&p(WAmo[=í*bo0 ?>^aon9Vsӂ4ux5}6L?[ v뚕,| j M_Ԇťmt%fqVAA*o&p]⸀P\?Y4eU+u ;`ϔeyg8s8E!XhLzx" 0D"bHP!Fh޵:>{8;P7CA vPqIL?* -@ LqE hHZXZD OL3E9=0p'52:Z)lsJJMd@,ȝ "tJT@AH)1Z:ӎ.F{Φ=By4~/X5+ϏU/۲kzzRDO*O&Xʓ5T cʓdUyUyB [_27{kX\￷`D24IT$[Q >v&iWs{(XNQ42&! BJɘRM9ƹ*WV.<: IM3fRXuU?Bc}(NѶMtOf,A9$W*0&=$7&DfP8jd@Dp P۽ igRi6r:᭶&H*ST)ƄܢP%Ol9%Ax<MRCWWpk}򋭱+:G؁aoݲT1ۋ~9MKz[~y#j#O7)D ,hYhD:YzTי ٙGjBJp&rnBPMpY/\ l< e D*52^ Dq]O򡱆ָ&-Xi#1HZM>J/9å/?^o8jz^Zbuc?^.: 0v,J@LWWϙeD_\,uyAS45E)[ٹ?H'a!#CssOVz4Y{XVY$-&h# rkTD%hDRV1ОrnUM\h&6F .pg5^0 w)e9[h@\>f:D ZqaMN,cχGjx-oqwK&D&Ad8IARC(6zfK#eQ sIp%^ ?-s=:. L.gJFf JJS4*6FXP"Pc8%Y^GAcb0NO{0a5m\%!'>LhQir&SQCUu&&k/Cgj ܱy7?ִ cD0M4\JUxm0&6(MMHmpm':A'H S۵Wr[>ZPqQy\moheb\IoSG$/Ss_FYWdL].')G  ֻgLfWO4s0J('솜](9̖evrJ,LQ?ԒD YG'$:N(O_bNn<8\L>4! S*nlHU•};D|5EJ( Wg~V8L-OmŽ8>_@s@[Sݺ7m݌kՃoo'lqm?iп-7]r! TQV.p8| 4&#xxaX0@h01QyN>f ]._'k&D7Qgli*'q&#y`oK_=o퍦GuV \Tr[Ow|w̟O<dz:C/k4_Oce]<]3Ut; SȩNN|o~x_??SŽo^&&L?w"@lU[CxCڜu[KISnE _JhOM~41yz),kmC埻2*F)d=TuqF߮@u 7}"~h<hͅW}H:gA1̀5%b h2\Z]jftFݳ{GfDcy听s$yhR!yho@f$hE' CYZ9:ē263{RrEU xU{W}ܷbT=mΰ|u'ŧ`XrrMe)8kdalHE.bb E ˛m"H>~3Q\O޵d^+W֡";A*E]fy"nuq rL(y<.yb g58㋚$=b[ "ĐL3ޖ )lj\ Jʣp>E eVrY2k)rYK!Ǎ6letX;!YfBxXd8iJpzW2)mqC2 JZ&y-1Yesphkc$h}6gMJLFe֝ݖq;J9-&յup_pz_4?Lɗ .7σd-1ecϒZ%$5BA$sRY(Ŕ*4Ra]"S>r&B)A-pM&wYtI:1e^՝;qŹ<]K:^kՆjwvħ,0@P %c!)&I Ua!`H $/:cDȉ'K#(:C*2$:+jmgNS1x*|"w[+9419u4YpAU\(,p8,YoTcH"sYH 6Zt`DIsi7 PVպ"~Nɲ):IklEg;MߺbIklLX2g ~feuI2FT/T/TǥxYo͑^KN9HCuT*oN~}x[2]݂oQެJ3ocCV/^)AD(,|8!e"Lc3sho6]L YN^\"w*VՐlfp0l2h+V8aRRTRo$g.'.A}P7 v#E\ѧaǙ&}<4w?L.~%6߆-1U#͌\G|t (ʉ20 ]9D>G0XHr!%e>ѱ1gn^#w[r'w7K/W'9&Oåhl^1T?3w෦hXm#M77Wr[ˏL^u]BCšI준 Br RLp2#p]\G6q^|{hd77kW&8#f#y$-&I1ɌYeI ꔷ.;,fiqV/9MP|7GopjmnLpB#⮱@2|W[]s*pe`@&|8hb쫃K>u C I$E|{nǧyj3sBty,}֫ɯ+OQaVlY7ޜ-d;y `M Y 0-Ʉ6mZt,5-=͢s[]/g )fS'u-zuydJt+Y^ZDWiL:)6l f6⢯oA'oҴrd垄;KIo! 
VG KfhżU_y,9O/wNQc6!F٨n!05.t.dBv.䉺TsOBh)!%C X0q*s8%rG%2TkE""(tY#VEB^ &WM{SE]Vwef-/}8ƒIB)-0ofNA b:z,@3*JϦk鮀s)Zzԩ`-4+ `jH3B뱒;7̈́SZJM]dx#3J6h-/浚RƻQ k8Iƒy kc.1EHFr sQ3EDT68ø-m-3 ٔx6NǙܣA!Ocut]ɂ4K[ֿMf9{-cc$A#A%$A&AZF-5[cU2T7R9I|I\TpMIut!&4لBahmwDpQD:3(E`V9rBLo 1J;6SaI.GCjrY;*AŰ1wPHPπCqoQBKYy(%!lk@&aRWwܞr+U~6z*]\KGynWK]*=cCS+|umppvկ%!o}9KJײ^&߇[>g+i&kbF"hL'D%juIIܡsIY '%Ġ.#BGF'njFƂ2['[ `H#bPEԻW Ns/Ǚ@4L4픺GSVwle}cIy%&re\`tb^Ы^.e,X:wp|b'퇚H2p8KNXLRgrc̐DR`9&d6N*kõ.WvA+P9f;Kz?P 7WJTDDҴ7~ۏǟ?{gZ{[ɍ_1)c.L6 6$-|ھ%빞`[-[ej`ku:}=߄B4r݈y?ո|>X#j2gttIơ g+32y`OU cIHe?kuHĈ@a]9Y?㋿OJ;sVI4 { R R5yJ^8CM,Խ  I+;.d#ɋ@ heF{xvvMcD.8UdW*晴S׮KQ=/)2ks3[& #rB د +|h)Hؐ,br UMW[GS_j톚XmbnWK9,mK?__`[a.eM57w keUD:uhHNk9pkZhkieOHT:XnqIEJW:U-(1"0Js qxwi&8uP#e`TYiT³qryAh0ݷ+K+Ivn7׬Jagom-.A~|?ěV/~rQ{avåQxKo'5`evǶ6ri\ờ^lnm&B;4)suF3>>؜>Y@~mk)wuˮk'W~St0|Ɋmh k~hn<|ΓPB`J(]JƨʀGWɼ-9Orx7K=?۵61r0FֶԄY Vӆ{?3xڦ P.5g*Oi Lit_*>X1UMRhNͩ9U4Tќ*SEshNYEshNU4Tќ*SEshNͩ9U4Tќ*SEshNͩ959U4Tќ*SEs\|a[!*QVEshNͩ9U4Tќ*S4ͪhNͩ9U4Tќ*SEshNͩ9U4Tќul*C]`|#$X =rcn8'qQI.Qe5tJ\S a||J(+0(7xHQdD@G$a6AQ4>ā.{RY-{q=se6cѥ78Ⱦ5_nR1q/Y+;:;2( 0/$g^< Ư)@ m8KACEIMuA&>)A{c$6H< Q"%1:%gIE!A$!\LT&$3⩣Fk49"g blfBzf8nrUƷ^y|{7S|TyI)#J99fC1ײm V1Y}cBudKy5gt  y*\<9 m)pK%>zL{Jtp3fp.RBr Cz^ q,tJf1*R*MR9W)zƎX(HY,Wp_XN=&Mn:naƟ/ח{GlBL*W RD{MjsUET^qVRGdrg#a;Ƀb&&1n"72WRG)s6#\.;vEmZiE`9s9P'MT* K&DG0E0iA9F )(Q5!"sbDsZb]H"AuTbl6NG!f` """+"bEĵ[yW!2R0h'\ YL0ύ1C@ O$j#-FSBGgT,eq6jeTAc$͸(҅xuOszɮq*.V\\ʼnE8:-L` @R5}bcW<{b]3^W  rc4͇)=(70 o?tiݪَ5aS\G7 V7!Տ9(\oؾ?WY+鞭U }Sin_I]N\խt`|n3N?M O%mR Ӑ?GCm x9N]L<\}t`=mfݬ'{,̉#U =>קܒ?uG~lvvً=0~gu4O?dZT$oN8s,|w~u`hxw:{ū7侒gCllWۗ +L85}åUR7c9W,Ր~K^tǛATvڨxrی3AJ,G1p]6AP82#Y,m5D>Gщ^/+2MVd:]_Z_aر݈Px^D?F;8S`ƃaRBH9&p'4a"lr1^gաD?~I;oBىuθhER#˗4 7៭'tFq7/A4hQ^55kgkPS Ԃ'=-xBRU\jSJ"6MLKF7IlS",9r'Wg8~G!+Kճ9*o F2 ,c:9NoXQ5! 
!&\F(>b"Pd @G&e#*GKSKW-*ff'~`]pށ3grՎœ6%ܭE>kfvZ iMvqK]jDXK9{n%db8Rh"uã)W4.ZU6]`Iv,T!KMTyDH&iF U^W!&eA5yI)Ybx[31Uj'4$i"gCIqЋJ<6ȓOB 4DP2(uQv̨tOΊOyCf8 aFw4+ 3!-6S!i4Ɔ [>:yWsKO믛yĥ͞|-0 &d;KPxf=.!=ˮbs}S1cy5R_\َGJ9/3eoo6d۫,;~lv~wcab yWk] XnfEYZ-W\-gVǬy8\;Vu7vljOKo;zvPɨ il FmK`\x⤢\Pni "s~!Z-, Is ]hb{}~5 3^xgc~{Ep2x`ᚋIgvty׹cfwMh{CŇ9a;RnUxmh6w7l/S* و줲]ڔTYJ5+sB'F?_[6I%a5>Q294@KtxOvއqoIg='m_J.w[6u{cuM?P="+Z1m#;Ζ> ]/_JA)C&3 m"A&"XVO`QnPhdkDV;M-O0i2jjY V}VEj0gƮ-fwU-Xׇ~\tzgf$oX ȡi3 4DT*`1 X/zCErY#TcFxI.Rt*G`ѩ'[كYp8I+2U *?SLV52+"120I(Kx, ,3j@G*)k` rdI˭4$ҚfKrNnngpyCNq!yk{QiE3-A4Ob*_=_N8}8~ Hk9c4%6'nT0ƹOLhZ-ÍRVԊf'i9ƒEuH1 :Rdې%哠[cݢ0tib1|N߹z| u 92wJKN@'zkmȲЗ`R``o[&E"cV!fY(XVzǹÿs udV*`^6; Pwt=U%~{'`.G^ 70rck_~w1 r.^'_E?)jV ~=;z30\ZE8nkg+k%,d'cԋeplY~KEֳ .5gb:,lN#NdψFE[F|NCE{2avͽknm_םדoW;Τ铟{o f5{ڹwhb|WLl=pQ'`Dp[것4Λ)#.;Sz!%#G# ]GsRj=R %zb-**I@KQycuk̺0;Vv7SQ5I)leKkYRrW7w.MӡOGo (yy>:/>Gߓq~.Q9ץe qԹˡ8 N^-=*暈zTvNOͯ:̥B0t+q,h:fej^{ g ; Sdut;_4|>ZF? f-Wr CtNv:mm_C^G+..ALl%sJG6fƨYk~fo?ݭ?wW2Ea D'P9JL2s:j♝84hЋdˢ2 P kjkN_uCHvj@M9e&kjB4Gk[Mл豮]wǵi1w <(܀`oWyLd:{HugtY].3qH2G^9{e@咂k@$h kAs/7wC' }nafi7՘lĖUjj̊N:u%-%/\SBsXLRaL)2ǛTxo.s,Sè/t}K^oVaWYZ}x:].}tj(qC:N eм8YvZ:_--3)x!A v9[)y.Ur($3ݤ"pRȡf?RbDW<ȶSSj24d%%qq{,P~xbeċ8}(Xj.=`+Rhu:% FJry O6J 8Z/ӯ&ټ|z}kŻs;;;m'CdghXw(viR~Z}~@x1I#u Shy8i- -;K6xp may`)5 k t Ybx"IRK \QeEw^[Ĵ7u#Xt_1~} .][ong̎`ڗ\lҾ2yi_r+_ڴ/_M.\0Wc1$,g0VS!T܊=?*n_R> 8;>6\pBߙiRpqŨ%1"1dj$!*cQ,21RTl-Q,B5 O` I&V$ BHG6*텑"0A('|Pd/ ǝH:6G1f<FdHx`LZO Nl-O@R,Vj%J$@BHdL꣢}uj-O3o ȫu,Z Yxdž)U*)'@qǀƐh&&K(H}F& %`c~iѹml򗷱YhuW`y&'3s=)XJn;~pK"|h8y+ToOq'LxΪzY}z[5D3D6r u].!O۪۪;Z8{t!WC=gw;VON'/DJvꗷ~KOt)U* +Zl<RE6&M ^mҕ *VNtp.D-ou+Qꚙsgx% [T*S~^Vd:^8r,'G D+&D Aj+M ,U!a.lH̒r!G8}lSfK~n]Bx q^.S`fPEUxf_F?' N%7wiQLmIQ ^". 'lown56zK"'zݩ2AD*ιU*<BW XOU|՟_E{-YRXxq#!dh`J(++=ZxuJf؅ßlaRvyI!rq&4MҠIud<WkQ|'wIP2{%^pJ)ErDmHZe;х=~V]*eN(`/YH3v _(Ww!I?]( הu>@b(JZ§3K^0ϗ&r4㹬ߐډB"h&VnGϗܫ@GWZn  ZZ0; O}@q[ -k|߲Ʒ-k| 9ϧoާ޻w0AB! cJWΐ "ARm}l1WJ\fΎlnٖTXN(h&)x6R"M J\ekc{#W^;wf٘mrmGiM#T̮߭쐶 wS\Ȝ9CX-@'Ay9,Js,ň"sT^6*3΂@F. 
5ԡa&EKfLz-*Ubٮkq |/5f}xleq"MW6mdD%#W3OWx[SKp_{H&RIT%('+\B!dBiAƠP!6Ȭt+  ɓԖX6$ Pc(@Oa \V PMYaHFbPnbمR/rofp?+,oC様l@6>5la `2l,>^Y-+efo}myAP he M)N'}$w:Ȫt6 2=*r`[\;bULGe1SέcED9ϥaRkjcƁYc3*D&AwAMHva3*+t8{RM3&8_bgmgL ӲTG;r-,MXJhFqy{@gVxJE*A n)"JB|++hf`r<*21BIAe2 04K4jZ g2kjhcc?zPYw/|80i21L372XnB&`<?t0­\GS]I8F$"pHŨ6)bh28|ބdK=1ITVIW[lN3)wf49>gz8_$@E}& NB_eQCCY.o\(RMP!ȜivWwWWU]] Y1gq|ּa ٫צ"'M1YDRu򁟄&$I]ll`ָT(-ҸIٟ[K=EX"0Bj)yOe2ٿh e SDhp&)paG7r BbF:o޵@0! H ˥CR‡70 CwZE>bi6:?>3Y+Ky.M%HQ}~k$ɭuyV`(,9\UQ6&u Z(=$u{gBh-8OBY]<ke~zjyf-!Sz밈zl9CZմ)f; +U-)%ƗtV]<|C1dԸRvq1imsx1CjeoVW&0cjN4BFR_KU_ b,xq}䱬RgV0fOOW__t) Mϻ'S, gIc4\e*IDk*S8,י^~E~x?>oO_x:}7߿0/:*ޙ_w"@|=]5 MS67̀o.F]vyE.vESxXe"&'=f:Z:l N3*M°=ylUcEh*0wHsIM_i?Y# a)J4s9E%(-F'<H@R֑VRCxK@ѥ,kQt(}t*eV4^̮ ,9̕J!A'E=>w:=峈{bH`=jI<}ʎ~ n"!@L8#z1רnjM-sDÃ%I.Ȅzsw^ ؙTSN.u>Q'a,r VRO%Wk DqJ1j-=XxxP42u3ӵ }@0Y,_wO#"v Io-ω ZSYuv{xԩu]QQ7_@!qfaHB9U 45AH\ׇ]yIe3X !ܖOt[~~[[#T'pи6P!8j*\p, cr)oeV+ĉ}[(Gk7=TMGm7cL!յd쌜͒;ҙ,3( 9V6p_pzzr[z7k~] 7z4|a-!Ԙ hJ 0c%)!R/9 dy%eGIaHʞє$ItY$R:"룎,s*HRw,q/{CX: $ :Hy#ƄX!E) \JWu*c☁ 3Т @N$\C {@ҘaglR3O)c1<嗓כFj͟>.OkK*R:Rbe}0@$>4<.҈#خrEmE5̆_^\ ~;Vcvp,cfGD&p摦jnD(NxQ >^.I9;8ٷ[MfU'RaFJ/\ۇY.,ƺO2l%G&e+g*\R=ekR>e0e+1t8JNQ5Yi o_g[3E$* T-+:2^g" ]}/ǣ:4ңu$W2:?<1fNS _c0]/7KK1ovC^(u8o?}uc`t09sH[S1bePGd컍.|*1t>7em\vdHHtuBЅs@%`(ƦgNmƼqH(򹖜N)bxЄR̻}UD_]R+.m"<M􎀘݁Mj vu\x:{h}Sqf.QOΚ98i-ϪUa*(ߝ>̍Fھ`OgןCM\al4Ȳ_ WR~{׾j8/ŊKu՞}~~{FIf&3;#mCžvͶ?x )W{^%N o/)|x J1(3pw͎GO4E2 B(FIht:C( JDubyxkU ?Wߺr uI@h}թsU ń֧]US.zbS.YY]?UP2m]7|[xs+5ϕ\xM}5r:=[]XM*nNn{׋f7mSޖyTWD6*.zW' QinnDFZuv}Z׼=US 9g:g\!s$ƈ k4ℨCcHXHpzp*_(|gjF>kk-a&JsMy0s hT΂vMXKd:ƔÔKĀu+¥ĠcB)a*5(KԔwk ۺR5]댜-M 3>?+L>"ߜEdrZɩgxn_o,ob O 7l* ܄1ȑZfkĐH)`G sB 0לQeMf=$Ã; Mt=?ִBb NAS1`nLVh$:@(62!Y]H .8>ηjl??,.ҚW|mj!|m.E,Lub$wYSK'Sf֟oM.T[W}CFH-%")L&W\+Pq 0EdpgRP vt3( /a]W  ΊN[lBrTa93r НV64qQ(6f$CMvi*AҌ[S_%NNuc"s0@0褝Mnt%iW-;#S@UW=|<$9_s#'ʸ*Q8 w*EUӣQk6M\q\L^Rr3-K2#o\ )/~2Χ'GNjoߟClsO%uPMkRpsMgCՈjլ1'=FQ^,3d|qѳ:ONN1U͵r֡gZge##H|6 3¯<*+W|4YttSR7iI@Rk_7'PM|\ER%V8)SQ1qOFGGd6 [܄URoxTM#G1 ?7o}o盯w?~oro_7O^)CMPM^  zgTmjUMn2Q~z9zD. 
g$Z>'/bF&w:g+t+rEwFcDƉ7fRX (Y sZjg˒gF}[?е1 d4L` TJ̕c A@&5ƴfs\J.u) ˬd:L`{n{C #>[VkOκjx:;;js{X#^OGD1Qbshj&yi.ǒؼ'6Vr[b/0bǩ\\;/gyq,zJHcHc_H84 pDVm/Rf,NƹxX9a/WUǶ(*#hqcDLLV:x0Y E A(g"(k52ArUr2K0h3"mp'و*餹 C %-pVk8ѫUbCZf%⢬bō* HGË Lb6@ 2}Rfi@dنCjֱ-x@ũ58~S.ۃ'B\;@ w-Ky~o?~ALMCSK:͙0WB` ,! "x ¤”-M<:J,xUP mtΗ!h]>7嚉wwL=] $siB&mij.S.r`I uh]'|>1" WEO+QOnMJQbp>CB)dkt֣)@4MzUPX`G,Jjأ%U85Tܽ>i^^eAīb9ѩbԳΝRiNP#W4t'ew^t,ӧׁye=bWib;xÂe:-x*p%rET)DmX[N*uTQ |T_M}5IQ> ԰3ClrVde+wTI%ro,@a&t;.Cs`߮HvޮA\*S52 *gyF9,#*hdTl 9]Yܴ t|Fٱi2>=4:Y >hND愘FN~{}c/?Ɠŧ91dq!(Z'C.JntrxNڃeYS]/>*,׻+,],]Tnun[^tQzBagwFr?rѥɸ{O )B:vFxB#+/iRx_ht#nD0ԅZZҙod(/#)RLcyY/-DT)k7{s+Nd~ 77'w?\Χ~zwܯ֮߯/ ҜE2$2)㬟4Bb^<{lT0E((D끑4BE᲋ /ϫ3t9hw\/F?,GX^?`VWGt<[ `F ?''eƈ/c=jD889^4&\S_A1*1'a8q =QHVZ)T7?l4@ wY$)bg .  mx> -q|?9!x*%~H# b#EXK IQ#2kނH5Uo٫#vGJ2w)`;@rc8? + )Wx6(_}r!s(W3y% k]XQ{6 phV1*C&&Z׽2i[tF7vCokə;9x~usJ!n.OqplX^..vͯw::g,C۾ECnJ:O6b9h588|v׭%}vyv&kW&f-'9I"gjN`O MW4F(E&v~°י8(O˚o[h[^xhd>cnXH'9CIZ+z.蚬*KqZ#,vtE&mRyso.{ɾ;^_Eޮc{0u&9Փ,BD]=6^䘬VLv$6F%6t=C>iQcC(SZ*`ԭO9WUVtSt/Ks'`K4J_{51HqSF[K!CU![%kdDiL QR*S)mv>+7-6yl"rwkKJ;i9›mYZlM] 5tKBb1cRN"1-.blt Eg)b/9%|~dD2hG}!:f%hP6lөg&UB%%s*TC>;ΥaN{ mW"NK%YBQ(#6hXߧ_Y*!C*Rq8QcFgTɘu!gkPO/Ƨ\|x AҪ*ҋqZJ9JJ-AtPz9)!K!u}d S)n/kiv`U -#Hj%6 Cb1m6e!jՌ$ԓŢK!hCD ɞ5)rSҨgO4p+ ]0+(IXkCwYv ӌT"yˌWh|z,`r`-OXUAEE'N@kah:~ԡn㊃HEiDM2U \.JC/ (28!Zn kBjcmu+y,X,ĕdf;uF5`k˺ FBB7 VX4m:tZ{΂n5TUA}5 e=Zn!zREQOl8n!`KU ݐ * *VjVa "3*T6$+ .ø)J V+dWVb@,AoTz a q2M!@C9eS<8coX#na!6گtYofVσGBb@YLu>%@ `->AAN::]:Ȭ AM 6ÙPhq(Xhq td(ޑ@ !;lՆ<5+mWebE-]Vkd5{RQRP1w4V^n!GAU3_ HO52*-J)MZ9zvOH |h*мGwXZIB>38o4(nv3RTiǬ椡1&bb󢐴rN]X&T!wv0񃡜]-&Wѕv;v`c{{Y.T!ja%a1-A7R4^76BO0v%SdIW=$V*X(ITXzz/3iyᜁ=h5'Hi4\`2ӪAU k2!a`M[hJ}$kIu <o 7`JrA+KP-"WhFymЭ(Z1Hܩ`U0>$/7zbqvo0]@,M$C.2Theh5S m,!{ԥ)G;~ R{ȋI}L!%:.x /sP#TϺ_I@Saʠv 顔K q[O @hg]+} %DEWHPl5CkHh'Ռ6XX-;ݑl,<~3PMȚR\(ΤEb&#EQ] C VQ;썃#2Ϊ ת ¤2,,H1#dȳlDh!0c+;QbզX5{-6OvOvH'Y g[=PxFDi3+jtꭊ#\_#moѠMjw$a:JĀ"`>8.a*tbV-6mz̃v;۸.~[_M?N.gy\G$Yzcը`c;VVO~FOB=V`&! 
>;-,fGm5EZSR`ΓDCak7vPޝ%l؈-'S p$Xh']r%5 U:( ,3 UbF(z Dekzd-UcG7,G:.o6b] 5r;XI7y/iLg("[T1"&7cȌbwXv&݃B @.FrL$PT֊!GCY.I D-@! bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &b3&D VGDQQ pg.=hիE@# I &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &>_H3(`cBx@6gpL}$ CL1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 @0 u<Zq4$uՓ@3 Y@ $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $gD}|i=KsԔzuy}۽~sA/Kku,π@9&p G<9p Ѫ.!J\z rWPIy՟_x#Yqo6u7xmQzs#/·?WūN8_yNp$2Cn xg{T0///Gt6ϝS.:ɋJ{ÍXt [qeCFu~TNqh:Ӷ'W5\Th߇r1[ br~.8`usw=ty;jd{@Y5Ɲ~Mof5(i7f.|{s.|)Rǭߵ{T{~F euNY͡w>8#tX,k?'C&/I 3WJ[rpAgt.'okwpG.O,w=^?Z-i 4-wM[M-qxPߞ@Tbx2|'=S'#O=tѥ$fru=)u~cgeg}a _` g}()y誩hQJ*\w_OC ˽⁒{|k̸IN;,O>.OO櫖Iw'Υ6O>AayyT˟k e6UWi1rKRH;ߟӲ:t\N忶ocWu{ɏMzn~2_|=N푊7#?-{6v0_w>EY|l].wL&\R)G0Xl0RgFgih?/oFגJnն 5ѕzP+eI2J dj=;_s93KܞN ) *=$@mϰDPPV|ѳ6}dO.d/ʊrJ6u%bHՃiETCvlՖ"BfjUBIw;Nsxnz5۽HFC%2_ژqʰxN>4E4A1tk?idBGɤrq֮w0{? fcz16?_!ۛ+ M>"mCWX} IuK1l1UЊD*IQ6c5aM.`lBc ^[Lb͸gzɍ_K mx-.& dr ˒WgO%ْǺyڀoYMU"_Ŧ2 ?VIK + CYy[!f,]?Eٰ-.pm1wP@PboٚO^.]U|PU:6Fx\#+/iRvJsJZxzN?_(K@]HKtY)BƠcr2gul 8{.FK! 
R~#Z R?i>?΄򱧩LtfjڻqwvI?o`!U`ZVB`,kQF"ރ]W*Zp5q6[8x4+R}>9hQڋQ]dyRX!zdٺfؒ45.oBHlY0FZ jɥ $c}Bv!dBi!J$BS' w2A*]T"+LuO0 @+7s`orA'IUBV%m#C vf&A:!ߐ%Tyi'5amDP Wꍀ> a!/& Rb!WE5jVDrA_UyZiV-.7?n.Zp|[zeEwGClmtN33{4H *MeA[(pD!lZQ*TAi y %Xu,'#/MbNRF` ^v;߻O}MM)MV4\r-cb6|VpeYvTא^T3\rJmd~:>?L2kU{š8]jᥢCAt}jmSND<[.\z'7.?P`&N%v)V<gL= &W4֓g )Ɓ`솝]*I ™LҵܟMn ^mfQ$iVփ %RL`җz7Ӏv1xxhQ,sb=OJUݸ֑FFZO㳗2%,luu9oG#hm~⒦4~Z͑]ofweF]/텟n/g/YgxTk?̃tl}ۅ"Rw7n04bUOS&g{WuWv#fQYޒ!4>FbG|'}kN2\+g{?^W=khLkɼm4 ++ptFg>F[mVw_>4Oˆ^׵ke)їLwRY$w؋hF)v &EiaISor/o.H}sǿyoo_~ܦPN+Ao_$f ]ͯ_ѵ57ZܵeS|~|i ױu!He@`պe7n{IN.kcC:=lN5{?6Z|qF\6ؗ~3AYH=&)"YPXc7,l!7;hȀpc(!Ȁ"zyNQ `}.B lYWނM낺=aӚǸBr՞S+dv޳TlbX r"X <2Tɫ(B>zn&QVWӉ'~V3X?ye [("(I.m9Z^#Y#tV7pf˄)4݉WE,H:yvձXwCR PI~m0d!!-J*Kk&Nxoé-Ju{w{`JJ}J]>j+D:Ƒ6?ؠʩ QҥJQw^;/~UeL{cRg`0 ,qa jkUvAݥU#ؔ | { R&hM#Rm<ו77_;7?t vs@;&OVoÔX6]7E؍m%ܝM7=m Otvmk!׊6)ݐƕm}ӥ+B/ D >޽ydCl]v4uMgں}ozK-/b]5lu{:y']#➾uC3ҫG3U/< ՜?m3e~glWr02A6-B-%h)cjE(m2 R5 ۓ'tRIT]-vO񪅵d;ĶŢ XQ(2); ˤ6*ᅪllgjZ6Z{hx`lM~lA^핯+qraf8LynaAI[rmVR)F,>ШukE{qQgc0!"%Sb.W<f-,05wzr䡛%Jak){O/ŷ=,Z?.bb;Oqqzܠݡ&Mϳ:yv*yO916>jՈ1D43uSԩԱNuHKU7" FˬSv2%x>gkMt+4,M]L휙.`̰Wݞ~\˹# :(*!Q#*,y w, ېTFA+(<@Hғ.@H+2ayPK͆.1RO|5=yieT)o^gqȼfdRPtځZ+B5³GB 218`}[d[zF&ǘgdo4 `8J&<1gLPerR=c5q6{zX//:_+_N}{W"M/n:?qӠaؒǔM!5Z%QJ#p'' P)'U2H"SwD4 %s3Qj66AKe4F)ch36IsH:jc0.GkiK6Ty s Yf&d1YeR$K2p4j \s<*!2IQ4 cDȉ'Kc*tUddA}!gKaXٷn$v!́BŵŒ>fő)zuiW&r0Ԅ Z5ҴtJ`g,uAM 1)6!*> p9p($r > Ip'`<(b<; .Ԧ&f#~@]3T XU)7m~\A;S&'mȚHu9fi#.v=P23ߥWQ'eE F&'/k40ndUב7q]k )kD& e+d0<$U N@t'Gխ"OIgFv~E~dD(gN\!#Y1it9$CVF" ![WV1z{%+*iWEV`/f%9 7Fe'T(ܜL"ЧR(H %)5 %9l_2k*,NQChdm܆(?P+< ו =Yw|&5ѠgE_F[Z{dR9F}0vz:&cC T. 
O:b"EBӏrF[kݱN JYUgmd9  gPdh 9PAwtLL) 4Hsyx282yW 9kb`HTZ| 2^)pjXyj _,&ws;)#C0@( l,.%`g@Yzg+uZmwm_K[c\͢{-  >mŲ{3#Yr4bS@XCC;<&Wܖ]n !3xm/d./K4L(+4)VT6*)OQA孲Z9pR~8?&+S{3;Oyu~im8=}s璘f9R7 b" r`ڡp2\T[ ڂ{ev P[jzoa]Ten1]4jfCCs6XWպ .$DU}7N>?cFА1IJ5O1xG35I)i'f3;p@ ,<ǝL'zrFnb.࢖"Zmq'`v;ĕ8J˰$`(^E/i$LgǒD#n<yqѯUϓ}6ԧD&"$*L;i0´Sv Na)L;i0g``ũ*PW/uEax1(ĥBXQ+PBXQ+Jx(|UWQ* _E(|UWQ* _E(|UWQ*UWQ* _E(|Uϵx.J{㫀Of:JJ=%+6 Q]^rkRbK*YKTX*8y$gCyDN# 8h 0{Ggy>CX֔s 2) EέRȰ, Ƹ#aREbD h 鲉].mxkK??6ߣ}XÞ`>>ê7d43x{?EǸ8πx LqYC*N0TACRGl.Νu8|_ڥ}nb&pNyå X3`NLj9+0Rp4N(|N+ L:"{"# |.{.°7ri{[kz탿zͻr84p>%.pK" ,Z]z%&cٽ'=Ber t!fz^dx.x0 %KBx(::\ܳ%33Cb2s\TNx5Gͱpl-T(a IGCa3aL$6%Pl;Jlzrr-WOdx(xɴ09(ё*#{THJ`9sa罡4*kc&PfTet[X``) NP*:d2llkq3h;>{km647Ųv^냿Izo~,3XáG^.uv:=6nFBԔXqE?rW'i8S}`v:zQŕJTJX^N--*#x :Pt 6X$D.í4<Ԝ*F. bAgu4D{SN ƨiB[B7zf)aKY`{983Ws;v?1kˏ7|hK'"tUӜ.b^ȼ֒!f]boؿFUc=ZcUDY1$BeVZ[mĆvc!%lKȳ.lbX `8M][+|s{} ?=^E]'7CW]&q*sc`0FHxC C iDD;\|"snlz#?c CZ)|ឝqIރ5\#rѧ)-0[52Jarm\䃙|2R"?_0Bjx;dM; B#L9F'RP;1I0UlS  ͊N[lB\:$UX/.|j,iQNqd|=O2M#j[[_|yu~V`(,=JAu Z(8{{l!a|v ja}oOm^S<]kmlz~fSzWsfl%Gٴ ;W0]-)[jt6lfYYDa8(4i' 7ܶٿO7t!*A[wծR,mts$5A}:TlW%)>QsұR0+N]-{?>Czkxr߮% OHNr3=?h|5'=VnuMR5w:*@St?l\&~o{}?~ߝaٻa/ipkL?X_$@O_Ѵ47o*iH^CrG.vݕ~@/}:-mCl NU{ u4-9}1ؗ~u0P^F=iR;ɟ:c3q4 H1~F5R(D3S[I !mLBPatrsTGWªȩ|,ChEqѩ`!Z9x=(BL [S0W[+*!QQdӉOs's :U X[辕=$ $'I*TcON$'0M` ݜo%Wa:Ņ?LI 9LULY&TY1&9%sO{#x94VR p #{c͚wFڈ[]jmսqPWO&͡^+k)X`"k"QK8lHSE57V"FMsFV;|=^r ڿ6oD&@[<.pEs! 
38\W=_˩f*ɴ2Wy> 35c%tWLXiue495Ũ)4HiAa sD)6c +U&UD W D4N#+\YURp#s.[!2w6T 8.2C RjeHh [?$S`h!j)1̥˛ ydAeh CRa"Q"!0G,s$#'q6qOjL4;0e쟟vB[.c ?v ?Vɓ{ejnvK]dbNG"*l} Pi|?TXm*U1xe23Ń!ڂM10VN\[7ǂ:?wNN\,a3@ %؊g*F";?+9 *?R߽]ko#+|wUd1@%{b vO[,53쟿ŖdKփvHbK4I!YU00!2P)1W%NpfO*ӚY;fsUPp!% Ke_V%Ye sHpFFleCbC[,ö)_ lѰ^J&V3.rqaq?6^d}YAsJHcie|V'YO<K}S(!Z72Jheցdd;fHZ:74ʬLCF6P ˉ}MW.A t (rd-1NDe!CB\F{(Y2亷BbƖ{idd3!L)eUUll/W/@%ІnhL;@V,hXHA%FKt]fhr{FYrAԓ"1gLdQI9(j3c58O3c=[Vr!\o{.矊ej_۰<~/ 4n<~[gl%bʦDVrh$Ht$R'8 D4QGQsQ0齐Ц&h嵊Fh2i;6 e:Y0XO'qMb k6|`6SRx$CłkfL6!L)"GX m n]YL+RdrE*⚘r%Mc*qޙ1CdDu2UjpNԯ\ 1 #V2̈b`āfAdӉY){&V3/,$W%*ʈ9c( h٬ ( xN0RB WAP;m58O3D]UbS6JEY/nTDHt/20h'cH_)TIIg^|^<>6:CUa| vگ<]wkNlOWbfp nJ=(X96 {/ ZiO=XyA C\bR `#{CWZ4+B˙&7tUj*hQ:]J@W$3VUl7tU ]IuUPr3;r$}+UUe_誠E{tE(5]C=+ZBWSztҬݴ4KKuY&Ou[nB~4E-w8r6M~%pT s`zӪ)ˈ:壥FAm.╾8a 5,dΩ_n)\pմ4BjJflxvQi{4槇Qr2d/lhvqJc*Y2qiНt\yː*UnT`mKƠ7 ˰C//l#M=j,6րjpPd0l{mJ5R, ƺ(-Hb`Bp?U'ZP ߏ6=0kU/tEhkyb#]!gOtU["E_誠骠<8]]YV?tU"-𓧫Ro]o?o`]^q|?ڴZJbp?xb{Wv]ٚO/J.2ٍeXҗ%-a ٮƝ~4wm~ƒvQ)'w#y : F Μl:9H;bDa ,>8/_?ZŚN@.-pLQ SF̘vIGagܚXre !:FDZF~~+Wt _N?ӛۗ M}@':ĨL,XN Ĺ`'\ANqD<8qV'Pó~<:}ѬRDy_ҷ0J7KGo_ =m5'#QH#Wsr9$CFLhh0yLe`0wI-[ӔX2>2ZbcΆG.vHݰ-u3u3%M{9?o7~ۜiUjuӋvw׊ԗf"L1ɚȭE\Dg$'R\vtj %Z+A =JYr1,Rυ%^L%Yfu\ q7fޅ$_ELe`ʫ1hKf*T&jtNle`?Zbyh_R@J\V̤Df *I#&BrIEMvʤi00³0Ba<@ @ 2Fk4,6(K]de%d%M$@PQ*U6ҪlM|fn}۔.ܒ%.&UxmwP,6 o^fkmd9K3(3ƄșV9wTpL @xH+Z@fF/e8 060O) 2/ӄ18e۔!7s ldIٻ %r$#QolbL̘>sB8jnc}&}bÜa8c)P{E? JR /xY*苦44~k3Gٕ ? y]&nUFʕ{>Zl]H=> \LG<]Y{]Og_;/]a(,q*ypٯi2?Z{B:k~g+lrh{Qwh.FOc^ z..F7WaSWe^Vy/kI.yQzʽM3 W4{Ö 8 Kw;V-&X*kUk K7ŕ6n5-M:{]Z=z۬x&Ghx< Ã5^^/%כ*Zw-׋RXQ\| Y0lotA~wIG@MJ/yXyND]~m:.8ed4oSiӒsl ZjON&u/|-] Ҙ{\x. 
>305~4ih-K==o=&r] l*[ҴvU=v7)0XYo_x|KsHF13uFTnʈQ 2>y8W"[!&a+VD{bWgۻ:S; mnG##cE57˺GѝO̞G3J gpTH6lz2fw`>WlNW i͑Wmظvy]Qohc|+څ'U7]s[4NDڗǺ.l T|u!\_e(..\-N37|L}uQ8QHs6GV+%"pBߩ1qOS-WN2n )yeBƠ3L&Y'ϑ;{3.`L sн3mf_=-7{]ظO˥#.~mQ;vrvRAe!&E6ъ Pm# zi$gT]=zgukkwI*Bpl{"99n$Isgp58OG70g̝,O =3 RBt2@,wGj҇ 1id'IҹUd*:Rz c26%RI:f-r1 ǒ(p,J `eZ;<~Aox[c5s-C Yzs%4 -Lb)`yhI4qek Xejc⠛M͛M4'tt^iHk̳%#w1W< mP2UӒ J+j泛ql4@,hdTdr0xHtHےdw֕XJx3r^cThs玩a%k#7e/i"Y4 \74hF,+bC#YDif`ɚn5b*0-qlj!$Q32uspRf#--74Is[÷oix] b >wys,1|׻4-^䗽ɻpt9\B`wAg˿%7犗f̢y3·0Eek6Wf6yQkj.,7*\J;J%J\Idje7n@ qEA|S췆e,`;3Ub_lY*/Y(oX[O9 U$JaAFH!jTɧruuS3KR֥ȲRdcm3de s! F&s3YqB/(.4ϙ7Iw/F&r0 ̓DooQ3 thPy@(~Gؐ Prr+o,h z]S0b RYb93WeD ̱s1=ŴT^iyk3- CjAy["fܸ+N-:6X;'8jfgf[;$M4V LyV5J%FyC+m8x^u{nks=C0^w"F*+UrYg%z!c)5hRda.B>K=QdRB(hxTI$5\h9Zϲ8.CCK+dwpYY9m=y˥e]Zn<];&{6qtN33Zxy)-+G3t mN\Yq&!ꉒP N4ap1.[:\^^:at(I֠0L6AF *YʹA1i 0pBi&gv4'&J[,3Q(,eI:SL$$RY ő>vJ;v_ Cϵ4m2 D6Y3Wed2e̐DiS gY&0C7NHovRgj}6H q\/A63Ys+W<"G7lOҰTSwsi8+Û5zev=eiE<)dw˘MK#9g\2qpptWs4'jf`=w]bV{e3\xEhQ4_F.sal=u?ީkd:J4ogM"ћ7שWQ eJ6σ]wuZw,~Ak1-!JSFw&7#ں>s{ajqy?:c("Uzq:|pibYKȖN5#6/mfQY>Di8j1FpRFpp5uå Y'{ i2etʣ/Ƅ_W<&w=ki~?}z^|}7PJ\͎l>HK&gRqaٟ&:' ^>.t-RM6tUcpr4I/|~>|C.~rB٣[&GG #zMYa}ӖrVKv]Nb !HaDZyT6Ji[nsBQ=`ڇ0<hecDe?#Y@F*2D⇥37>e4:3Io;1"3B2zEN!`}.BKlq+oM(댺g̐ivC CWBv޳T09tb 7#M2nɫ(C }4WM*P;NoE}ކ[kx[IMm!QQ:oF+j_3ZfH)}ƺRe Q7iW{S__].~%jץru,1>Y*ŶcZ[bZ=JN+!&mzW}I^UrӦ)ҦiKT\UK~1˯'?5 ֟.q&T95#|]1ٮ#S9JkƤсg8`*A>8!Y֪(8/{nP.%@`0KrEf1Z̑읕RmЕ7_B-Ud[wo|Jӫ)vx4z/Kj]IS]>evͣ oBm\!ͩ+vG~AB>XmZ$ݫO9yWfY6勇nU͓o~dO^@RJIϛ]]qMtf"׌{k<|]3,zE\5gݟݺiQARb׺݋nbWw/0%aC6\r2!8,ZX`Lkf1g]̀*5Ba+)M'Go-((EmǸ8a=nLƍY($S NBaVCDM+SW;eJTzwXd:@V Q6Zfa&()9[k\GA`bfj,]]{GQᏏ9/d9 ho*`UV sWUI r!˺-{id("$1B*HW2ayPK5>cğo~ijz7YOR^GUͽ>OS6Ը!vIYʣc!H93>HȨeӈ1K.zRl.EIt+p@62Vg=2*հf< uXTXx#0w4ajw A|DLٔƠU B@.|T< TxIr ThLݭb0cEBRkFh2v m`XC,QKGø`b jҎ6ڼCs,^3dɘO92`ɳR.(ocdbU<-c8L̐Y$!HD4Tm3cV2q2Vg=F}I|֏ b5x("ʈ(:DqQ^LLV:x0Y EЀJH3ZͼL*:L*"Zid'EfmD.8lDH:indȒ`3B)^/jO8%_g5-y(.ʸ(;\pqeDv0t΋l1/YO2K#RH?$v+xXM;PaxV1^lsƫD?/੢gfvlP a˒#:4ihl F؅=>Q~mj$99ޜyӳ׫M,T'}\pk/ 1@)GJ2K>&t[ 
Bg.C8FC{'^4y=K2v-taS]Z}Df}1OQl(h{I+87Zlyf%J7 d9qt685n$suq z`ߺ*Bۙ0[2aJ-cY݄}.C{M3aZW7L -( FM<"1ALNY3 .wQe2h6!4WEQr#7BW'5OI%]3_7 -S,i~h,2X7b|v48X!,I{0{+'3"AL%q.C H!F0)3؇BDB+'hxn,A(Blg5mpl94q78}S6dlD2)rqA:{WVKg٣;klw!m+y_qf}bq3u"զ n!zث/M?W/eQʟ&_oz8WBhW)ES4xdŃGi5ZWi0qk{rQږy0MD?f@7\ܚ d~x4fWMTЇ>^}p>~H>J. 2zNF_\[ iqTaf= ޵6r$OvYX >CaSE InC8|(J+˜fOuOUja(rK'81:O>zwGT%tGV (HOu6&8{( ӻ&|QtTql'WY뿦G;<|y|g ~kv?;D.Yz0ӂƻ_<@JpLRH]e2W3[)Jƿ"ȮѺC2G=y++)PwJz̚Jxb:䄉 jA3T$G)C]\9ԉ+i]0m~-}>]{S'M7y^0N2 %ch.oCI 0 *<6(ORK \Qe'EI{%j!m}ѵ(v|ۥN=RGoS=Zkbu-qlcO+z=(o&Bc`#紎0"iS9_BZ,] u%S}(T>/\>}NoqޅEׇvOg/cI -$:F-F8X0A`,GgI'qLz#;$!͝b!JĒA@IFL0RQ2iEiC9izFëU~'vQ18j݁#6OZ(«UsVĊ'@ )qPp Oxj%J$8AHdL꣢)2g4^NN:[rŜb6NRI_'<<4D4!)U+`rbigSϬ9^;K!&C ՚R.)bh5rCz A!D e܅NIRh'uZp`hQ{Җ]m'߃Bx z<^&#\eߩ!MŏXTT|༳~EF8\$맣:iBe > W{uʅ1 bNEߦ:R_>Mn^nzfr!J 2ZG{j.ӫ;ޏnAph9%EL͔jj?fV.'9Hdeymyژ(a xCv_ mԮe%6; 9n-N.̕,aEf){О9ZnfDvBVo;HXhiJek.Mb$@3Q}6ukfޛąxڶp:Gm9f+hˠM5*ݽ5gc1 sXGygsvɨ^E9Q环Nv~VY,܋5ZIThW-=Hi,&l'7D;ɫ 냹EܒmYa) 9tJ2am}nGA>maĜ=>/ig;#᜼}wRX?iV=4;5:ƛw=I͘n؟N 3AIh˂{m8l82꘧[n ["m>1I~د\Xx5b̖pLMx'g__gHZo@<}og\pd%i >B J [L}]D>/tdUr2(/_`C-!كRy])ͼ( 3A*ArjN3M*;}J BdU1cx%,qU*C JYIZϴ|}^B*FcJdd@H}Pe Ő<*ekQp))\"`c IJSOT%P/Kψ1'L+Mx*PZTj]ƅ7Zg^3Nשbş3A)B<&A{?fa+# eVАLQiiH*'TIVW|thuZ:.%r`+RXPX;"D锨#F*)t.#]L MϤTP C?f7o-dsqr5۶3A=:]σ7auYdP*TbHWY6}l~h*KD袭u#^['U[}ksɋmakne<khu}5U'oݶ(kg5ڛ;ӻp2S7SwqEjc$iWxCn}ݍ"zykǿ6T⎰9":OQmխV=@ `TeBH \<^L=^z/D!hr1Bpm4Y鴕W$7 '-l%$ PchtR4Fս `fouALg^?y͈R|' }ȒqaTBUY(5C(eWCg}[ ΧA:p,%h.b띔ZNsBrO@3^I(;ED;ĴFk!h9-9e^!fQID~;D#g;:jpZf/Ʉ}gʼjרK_ŭS!4E hO/mz^>/OD;UGGyr}]@ȡ@Φ`1\PJ&Ĥ;-&qN}uP陀  fFSBǓ@Vјw(*g$5"EF҄%5gIzR.rbb(g\}{b6/QscNxQ=Il/@YBO Kf % n7/ω(3?E@jQɓHBZ&DŽI*A"1;Uj؟tv3mR.H""4S&5x3"I&;QFqWR< . v1MK\T:G1OZ"WYMtsNIkXg;i+AƱ-AOEȡg }a>^dRSA3#8!+ap,'tQN<`ۯ"8Po z7 oićCzȢ+@^q2HM)؄SGϣrXlZ9-P *z%* QPP^J'uDuJ'7yQ6q  H2Znw4D\;b@9/steѹrgWH# '3ǑTb& h @(Kc16V/!Ђ\:ppyQ( ?f{7HkKлu5LVVI5}<spCǒ,lՊjjҟ鬸#$]EsP[Ӥuoܼ\|nQ8[Aōr=˫ڂ킡Xm:+7-n? B4dXH(Wt4hofY~`Xc`YrBc^\߯28`GNiԦv>ꨄϦ)KbB+Ep_Sgbu!N-_>~~!}0m?ǒq(j*Fr78fNz .Vk Ԛ*Sm/t_/0~xwOYz/)LM$ݛEb{' ƛ 7mhs֓b\ڜrø<. 
bqM!Đ d~_n㶿%pQX#T2[a=dN No^feI~6ݛY X\dVXFW??V+Aq3C`6"#E\D brHS:2E YH_ Rt #:`3NXH송"te4>z+%T+PGO]N޷Ѥ z+T`P/נ&L՗hEWU+tVU\ f$S .=*H:{-%c8 з1x4 )xkb&B$%*=;o|W~{ l j2]-vފö^HR47| _oARM_'9UdFd'NuپlG[V;90NDVYumR$@kQp5 ZhtYf6葃yo9b"r.{gJ>qyͯYj'\@w0 Mm):f8ս.?ߢY%(tSb7^}qrnzr o1ڒN,%r-if6/VB.~03"\du=Һ^=-`Y_֮f!e5]_Ar"ydZw]ekTYk&!j&LZrAbJ%X2 δ3%VjjZwVC X"Rr*\ aV2"WxJ3g[[#gtH]-!+F2U uF7]3[ϋ!/ 2"&psq,~}<6P*/jAqJĄQ !E'oSؤNG܉NeG!FˬHvL "):*:ům>"1t BޟOd T9"hKUHhe(Cz/KY ^ ӶmH2%/?"&$C2A[lQ<XJ#?coOv_J?&7lw5Gʛ8 AO;@)IR xƱ&03}0),"2:ǘg01=)`z]>:E*G.HۓUZ8cG,],ƒbR"7T?a}0 Ջ?nP~=툍"KRcT2 >qn4H P*I@J&ʤFV2eE{&(zQ u&ݎڄV2 ڴ}9qA<n;vEm2j; vǧ !b:@I,O!aȒg \I5a%&o$GҐI2HZ4 kb"I$ f,# T'.9VKr 0 ""hE"<`ɂSA@rxB#=U -f3pƔFҍgF xR &אLI 4 |h;hklFW^'\M#슋2.B.-LJ҃EȌs^dd m!IL~@ JJC߀.[][C a-9kWa\BޏSfV6E nC*TNW$4s{xtsKhR^H:V:ˤ9rT68Hr|txgv5um¼ @Ѻ 34XHxN;aRW7HF5(-(5!hJ%+u:%n+gg\0uJMA pd8*]\+p+aOI.XUșo׍` ӆZrn?\}4}<:Vͩ(.h> pPf.L !K՚\Vv8 BSf$Jxh$}h6%=\bDv]h; Hkl3kRsx"ս\vǂL\ܗmHK>Ԑ|𩚺ԍNzvEh:Knt ZBOV^6dĩ$+/ s d`rԒyBpU\<* \j%;v\}p%%h;?O?_{V ~I JLM<5=G~ B劽#_H Iƀm?g^kf qhOөLGHJ[] {,袌9A a R>\o n*J7x{;?~|[ɃUIT,VVK1d:xq\Y"gqlXV%;L8 !NfLpy[95O_6{N{=fvf+6 $uVU\[c(֑=%R t,DTTBaP)yb*2LSB'W\kNZ}J\}p 2~J`N V \\}qB%v"\Q'WDQ d;\*e-•Lh}BpEU!WSB>zPi;+gL eyݺQaKzXDt^ӇGyt}1#m32b%bAph~;g4ܛ=&k9+7Oi MJQ*E"C\-0^Nr69Je~bv;rđ+?~N9΍~op?sZ{L&'e_w[_C&6 6y>Jk!`=˕n WƓ UzX}c9C`UrYWTZAsl"ws0r(:iio*O5DLJB:u'Qj'Mf"jDB0 ]±$J#肩L]caexա>F7GMDY5Q ^RB 9 ;m>e,v2j,[mV6 o&q[lbqRhFA?omTDC6*Z鹎I8o 7YKJ߮BN~4>_AY3!xLzt-5-rsuW?~7>  VV)_泧eYןey$sfUC-(ՠ8F!=J; e4@΍b`"0ۨr}ZtWa ZH~r1%*WyXPv]mk,)S.@,ld$"DL!*d[1 cr ITX"Rzr1. X[cL{MVAn;Shkl6?F#Jk)x5"μ\PouwI5:woMf( -WBH۶Kmt0'52Z)Ӓ֤2D'*ܑdAt'6bLs3D$F*  9v&=ML MeϤg(Pn4֗gj"_$U9Zjm~pN슝s+_%/`f?[g^0ÌAvstVi}wr-mk~zO@T\~"rO-ji)뺈BlE _{ZKNf ` R eJfYb/m x  |Ѩ;rʠidXXmS &/D쥴6 |YFV0 K9,TDh,.7⬗Pk2 7M.=[s1|b,Ȳ`o7,{Utvvl(t?60!d`h;:ɖq0!r0!ԄZ(1gb¡U:@pA¥/A$ Qdt$'ɐ56`! 
[Binary data removed: gzip-compressed log file `var/home/core/zuul-output/logs/kubelet.log.gz` from a Zuul CI output archive. The compressed bytes are not recoverable as text; decompress the original `.gz` artifact to read the kubelet log.]
4EҙVML,YDhȸO+TA) :#k.O.x`ho[ 3#W>C  όg i%4̥!u9%RϬo^GidXCNwh Q)T70fŒK4ds`Tr 3B324-.`=j TXJ KSP1R CNP8KejmH(Ĵ΢yS!mPOAg)[(}uLfgY.&k:&;W&Q緉:4v8Aci:$/EN% GFlc)޻ YTEp= jrե\=11Hx\5mF4I{SLF\E3FSC!E&]PcFaYl *݈P#+(DԢa+6=NU K?t`frMT#|GF'Oz>p|wB0:WLzF45jT&bʤ3 `tFz^ZFTb7^0%$LHh\*>rRtp91RRFel6-c;6Bmm>-qo}^{>Z]h=[ ]ΦKZ\\epde4 ͑d#kyUe Mv䴡-,OCuBbJV#I\a}a뭎Vùb<]s1c[IChlhG{HsQJFd̡Hi%G*\1%fZ[JACP^D(MaSD#4{Ȍ f/֤IfKc"jj}b:КpS_snӔaEl&EE-ZC\^cNZFjKIgHRB'yFɿHsN]kOnxV_d4`\%MWQFE x]Uq5"zJXl5gF),0iԖ0W8֜;p+Jb:GE9.x~EZ"VV5mLr K+KTɒavBCō1u7㡾_<|6דV77-V?>GC~\=rrδGXXuHzG ڸѤeӲZ`=n1J}\vRt<Öx?O_HM.V{]@e%p1Pt{h*7vG\V&5XZR(MwPV̥-\eBy7Lǰ]$lmPgP#yɑ0'!Spy ;>w4]WE>9nv~г:53+Etչ^k},.h֎%^C\o@:;daP^{"l#Th'h.bQذLx UsU@z&^DW`tk%RA\n88aĵ869NhLtd6)TʟnD8 :H)brpfՐޫeqn: hf1N ?}s$8ثn}ʓ9eᆅN-,2u/?ˮoO3/ot^n嶷g9~N rς H5jM.ckU|DM b(cѠnƣAe8*[.dv!kԅGm{j(ںG޿i $/N?*^п;2m5CU/ww&r,ijCg =lM&s,QYgUPQH}EL=AmPE Npd"qDuМcRgT%coc9wBqPãp?h\<|sjVa^.Ƴxes11ه#f\'j9u>z}e*5hrT)j{Lkh*&w϶|E F t`JvUaZ@*aACTvMˣbyS]5!.ߌrKI_8iV6}fJ9U!9(vgmz,')jE쏜E6hUp\wG#Y0MlZ.`AZ'Hh' LcX|$ޥ9HǠוu1Tuʶ>λ̦oM.fW*ZcB'$(P}Q>f뜊e䘬P2eF}I@CTR,F|&>jTͥH;tgTMs+F}5 W冥g,m|N’Theiaj cN>YǸFM =ekN;1,NQiISÞdo]?~o_[,Ol~uV{%I؞}?^k⠔Yyю~p߽r/^Ekɋ݋׫%uGBxKl i~/hB|xza8.Z1J ii>[iG!W :!oKp@3,T4dCu^YVǺe3waAlZt*OdvpKـN0K.̏"~X;BwNYsny7i~ݠ_w hXN:;+:$+w١woTޢWo`򦔂#UvX,a<8u]|k tؖGY]qW\V輦m#&hf (zaV =,)q !E*IJoV5E瓎Ruvhc+Î֜;n|p6i~㞂d9UHE//AAj+s. 
Bsb=)hcmP| ζZ,ֱ2-C۶<*Xrl ŸP9[hZyo _|0~&r6ډ%gAP,GlPY'fF"8رxF~}}4oE\fF&i":TD~؝Isa_:,Fktr|t0ϖ!W,kRud:.ui+җhR> |t74Lsg.M&Φw>6qSAz,}+Dk\K]w8'{5ϯ{gog5^S4had}svfP郺EBRpc* #A*ۇq?:iƶ`_FPN\bhR۳?/P>x/*Mkl Ѫ`JqLѴ*@qH5e]R6pɭZM L9Z`Gb~PV#I ӏ{Q4~75#!E;еI>r}En%J1&4<XSNlٻSlqd:x(}ֹO +eF&!/bZkq`i^J,Y͙|?5lK7ekm'HPj/W>_Z{[޼Vɜ=,S 2 LoR/ 7Tt B]:%Z=ҤA 1"q&PRʤc1ZZF|zۧ*֞]CSû?yl`7JE2rfRVcU}-F7B:9|t NU coI9{89ؚOգgǞMʡWL% \ѶY,κcidXt{׷^K)glLZOAKԾA6vҴYWj0x4E;`1^aAh>FU}QYSLUY`<qY46%m@T4BxBDz!h&*(֮) *lը9HjU2RP J:<9iJ\.;8,Dnj 5sdSwoYM9"U"F¬SH:1cRD1y,AJ&gygm~xJIz_81kW BNPbܧc.Qu*Z;P~X6Fz,4Ygg3 oi`N }cTO,>*Sơ;< ]>RVGW[ 9iG&( \K%R*FͤL> +As CoVqj mWhbk4I\(.z<Ţij:6 ZnGkث C!/UZ\2_6hZhȗ_%0J*E1A"xP|bPk3bIXeժJaSN`p 8mgoB'4[L<'|`TmnZңYP5EBORiXMbWdRN^\XgL,l`ݛ Uoʈz$H* Ӷy7AjDb%Ɛ]X:#]&?~/#? O-|PUkyURiv䲊SvN G.hhZAtrI&mgr6 ̦"@޵6ٿ"d6@Lvd,y$ίÖ[)[v_ݖ%佷TX ?d"\rHZ S!*tQ8Fk5x5Zx7gfde0V8h)lhzW\S 3dKuk地λNyDhnDnzГcQj"XʷC=0u&+aHTFH5*(Hd9 $AK8ԣz,XFWǢ$a+kד7(u=wDMI3oD,` 6F[><]R =ʱF)i"oisdqv(p,(1ڈVYPɁ+1\ˀD}2H8/Ziw eGwM79J4qtN33ZxRs$I,ぬߠ։+$8@&%Ó- :at(AF rdF+UhJݶ 1<& hzw6ܷ8a{wfu@.M 9h h}I:G`6eg1Gz9ſ;iy|ǒDS"i!1T#ZqD" Md&!7uTW^0%BZΔ\9e's-cma}~B=~WsdRo?ɬU=ęAw=,'i{7L[tۻr-?,"ѻw˫ou#'ʸ,N^hE.Z]^8vMk-!i|vGZɑco{ŽFn>ٚ(;t_<_-ػ{{AB6z&~˛;H?Ѝ4#I~MÈH7s_X>"4EwŲō܍ٿO\dqTMnVލ:)iBGϣq(]_OrxO fyuJVӖqn>~0?_l₾e)*Bzҕn_ʾQV7ݝ1KZՅLryj!t4mD+jrfo.F0k]&N!ʏ֫/X^C Ľsy`}KgHA6j"Qu<d [ $6 㽅~Q5mǴ^iYEWs_NGo)ؠʩ %'J.NuE#~9"L @>l<\D"f2V &3,lSUq&%VY Cܦcp[.e"r.{g%Jeo\TQ95I&qfT>8, 0,#* νaa!#q՛ 4_*/X0epLBONg{8,(" l9vZ⪦'8=-w.3zkdn21$`<.B3<4ױan_$}oˬY8b Ʉ\,ܧD9̂>/-HD&nG׳'\iycLvIv4ޜBԒ?͘2F1v=zJ+I< {TTt;!fC2%]ӈ]OW \wvjצr ;ް9Ӣ-pu}_}esu4Y`^W?ߣålȕs;˂+77ҮKڳ{ʿ.;ɭSHu͂f{ץOH6ӿŶ nu{u3k{ɎNmvz7eC-Uwww^Pb{|vy(UO7t{5#S;:+t/ʾW%c)_47d}ԦYmEk?+ZEkg֨n}DK??:QWY?W/n#qPywd*UB(;ZςV )Ĝ8Ho)aiM1{Hh 021[/KsΤH \'2Tmd6XTjH,T.*aAp%zz]zN|c{͟5h`0 WX~7/d "<\cM{y# olDPr]&uξ>Aq}꛶f"LK^k"eD":^yO%sYkQntLY#h@'$WE5 m7]_&D|"kiTR̤D%[yϥڤI \@!A:`x>`@1Z%u|T6HK]de) &!/P(iR{:GZGGL; Gz3!&.s=:/ݭjZb^[l3O "3YcBٻ8$+ =.L;)Xx`XCc^# "oaf=Yd7%E,**2.d dAQSS.-hVݥ,Sxק#Blɮ3VlrBWUE%([*xDQJIYjA'͈_ p|W'uj^[fϭE啶C >])krפU~MK$Eh17K3Vhifw7ȅE|$O/No 
/t6j X)gmc(uSF|yդ߾idXg%k3cnzǑYnhŝj ny{, w:?>cqϿ}ڇ 5UtU<*;_Ȣ÷%}wcNꗶ;.;QjV7s w58` 72w̞1qƅ}b/^~rZL4wC;cR7:3uzxq?jp??RNs~; қw iyn[]pfr0sͪ;ƊsgH3lMgvϡ/V*4>]q/my[وƣEEC=懱`;*^9'#Wcs$ћv:OxG8lvw??:|ݹ_l–/¢-ۗ^{vvtr9/pwzևv9&ms?jxAۣk6>?7KMľW\4o/8ֱF,1 s$w#/TEUt=R;zcުn+ɌŢec)G6~yksz@C<)G;- ulR=rVzV| x~oӺ~F;?3L&z/+2;};E^LeaH %zʳe_0\MBg72glE`~lT J=>wM?m|os$BjöGӯjso^N:@}ڼz 9}>:~~T_UG[ةM} Ѹx~}nuy }JʩGazPCaD8fpfC41*M6GX`CM;+(s -^C>UQ X3߽=Eh^xO"b)^߽o77XAt;.ck˯nnpe&Y"ps3\H+Ao.+ۨPy^mfg#uKa|RT($;+v} E k.?kNy`.ȫ~wwɩqԩSo dX*r]X:|E4m)塇AsE|m}bzx-k߶ 8κ(t5F=Gʉr^i T|0{uS1YUS.\U\zGy(VQ4lMz.w7wnsِ KZ[m}7(NU͵`0zŦ nصUsϜ,'C&Z גMɺr9~v8K{_j66>dm-vܔ)!'Zvi 5x$3wlݗ󻚁v(U5=L&5%gnͤN3kD@۔R:WqehA+E4PιcPaټQ. k{I=|Q~DJ8Äk>6I%wǢ׿wo(m^()#3dR H$" zdUQ]0 ǛϣS mm"C@Ig6J&\u7:ђ$;)5b57Z򅉑hhMkc1 (6Zz_XrAUCK0ڨj]tT\C[ FTu"tvѾXl)qvQ]"'c)ƚV\fcQYc gݠJD;7 cm.5sBiN2[ X3 ɡR.iE)1ؽԀt@=1(`-)sQ #%ea@@x'PbՇp%.F(2 gːmVW;yX$Yqм:љ^\s!BC] +e6Mi٘ 膵y m "P!kG**2pٔ2g,΋(.ilP`+}6;6F&;T,n^; ,s&*8ЄWgA#NKbzD\Vߔfm4[@AQ8v6h&jKdE $ؕuùj@o]xb܀AS@lb4H,@>Gԝ/FJ75gQAgE΢E nB(J )XJ L t"% yc=N `ɦ p @] wf%ZMl!^B!E=TDNRP6ZjCB B 8FE[}/J "F_R0 =Yd^nW9@He!Z֠,DP(k4 e 1 ڰIZ񡀵+Z2 x&5aPcrnk3RUƌYw  (QťU5jBB%ϨWM@̰sgϝ-{ι9*6ԂGGo{x*Am:%T-do 0JP8|Pl;#똕dHW8NlQJ2TP hv=!#ȗ[XP茸PM"LyMA>D(ft4oei /Ĩ9ra@Vl-2[q;:Cp ]e,X]\N5~t}y!byUs& ld-Ł$ Sƚ&`SꝽ#Hh\Fe}c{#Ƞ1Ǡ]9{6^,s@ ѨT,)wK|ŔO1jvk)T2+Ex>ܖhviW,\9ZK) `e*;fQf-Ф7FQC,ϗ.P,tSjad LBFt޷{ OW,"U !@A){E]5B8ǻtOkČN}8R:o*#V-GO@5]hK[]no%hu[whjO*vz?yei߿.Ֆ5]p|w .{^o+!;SfMKnr[s> =ʀWjKrjOHWJmEp L"Dp%+\JW"Dp%+\JW"Dp%+\JW"Dp%+\JW"Dp%+\JW"Dp%+\JW*]A\tW0'#ѿjXHW_ \JW"Dp%+\JW"Dp%+\JW"Dp%+\JW"Dp%+\JW"Dp%+\JW"Dp%\9W08#>/rcFXrK\JW"Dp%+\JW"Dp%+\JW"Dp%+\JW"Dp%+\JW"Dp%+\JW"Dp%{ #c {tWdW6?{ƭOAbKy۠IMj%4 pZLr].w!ҝ'\PW#JJP!\U!\U!\U!\U!\U!\U!\U!\U!\U!\U!\U!\U!\U!\U!\U!\U!\U!\U!\U!\U!\U!\U!\U!\UVULk 9L !,sZnK =x$+~1tEM^Pwpnpt%c8:Vg(԰Z~`j'RxOf̍<=X;4cKcmK{q`_z?~{4}_>io-*H@NP_2V4xATd0Y~a@hV$=ӟ+ b׀a㻽m%<Ʀl{գ0`tnd3?R;3{ݙAN Wxծh7b= 8Pg}`䚰XiT T6Sl9>Ġ|= k],ȗC޿'Jxõ@7óA37;CAGlK W#=ײU=>7JKzW`hP1LSꐣ>-GP5*b DQ֞SK Aj^-Yj}.-x#z -ۦy޶ȳ4wL3'bz֟䳸>d2~J}>%{b'kz. 
cv+(>ڜ|;/QlVȓhAx~Aʁ[Z: 9ҴHP}篚Z5ϴu^Y2G/8;_{wOP) > | ),;<9)1 4kL]YqI = WW)y%`fRu~]C\J0{++5JP 5EqJ1j-=XxP42uVԙ#MFԙMFך1 ͏aaqo:P#"qC'IS-BbzyAuDkmyΡ訋4Ao̓\hܬ,lC3\[&*pל[fOOf1-Kςm6QWe |6:9%ONƟOG.SL:9iR&SUԔQ=SՕ?{5;fxYB% PVcCHmBhZ]2Zl⬷J6[M3嵅m}ޕ-K69K`Hӛ),V?sڍ՝?-P)[[، (g.lHIC(#]ar"6 .#O Ҙ08aԏx!]c[H2[DR,biXWڥlZ]"-v4^Y~KY,,Q0)OxJf4%is?(FxvqWa6mgZ]4H3G] |8qFD;nW(Q]E?ξ͌VM8  a9L3ULY&TQǘؗ; {&\wPwͶzWl؀v8Z=1^q)?wOY9{+M&]/m`'ӟ\ƬV$VH~t ?z쵲 eL4;¼&BP54Rg͍Q)X:vFzڒy HnnumWӮuHw-6 HCT4<+ = I 3c1KEd#L<.7/K y67X op4f&PΖ'ݳg 0of1?`r} "WO,LLo|XKbMX}l`6?}CK/.h2r~:z8wp)89|~EucҀAޛ7vVm89#!9>_f右A/|tߌ.,Nyz^%F}ꥯ8OKyk?0~aޚ64뢆]?sTѸM!9a_ǣӥK3[|)0隷_^&%IֻL @{OɬDo=~XU; 蜴M:ϚMt)Z99sS-M,džv_\// W])/WVd.t|~vzbqf^*glF!.j)VQwFC\Y|apֈb.Uu4k2;qBX6ern##.y[m\](i6(e1XY^&4Y?.C Kp``BJz6bCn]N qH.R;ILn-eNK*Nuc 91A07۠|si)cPGՙH*7u!/ԝqe+wq}u5k#Tn}3[[* 꼋K`bz'``v;ˬ4*IuYN9tb+]tms|zY ՈqPwqO^:]']Q)y. +kXJDy;w]Fę\[]]tDV a 948(MiR2%': .ʻ?4x0rHRIpxa_Im Q!;Ne' T9Cڋd?HApw_a/5,5~TK\Q$!-k߯z%%YY=lQ3z|U]]a$HeWd @. -wC1h0Z)J^V'a8> 9j 50bQ UWWVxo4MU0t}of:ͫפ%o]zUTqZyᆱZ|ͫ;mv\꟯H^^]_ud4[ oX;I!aI"kڐd쥟} =&UOJKG| *vqWd6TO縺UUgTxVǩ"7ZɚjiE&j.jж'YIqG,fP}2#8c^+.uyX\*yGW;1p70 .зh, ?բŰꔐoTj)UnIxc 0Mat=hdd]pսoKG_7 qg1jyx&2w& &o7O_jW-?k’_ڌ{WKA< cGۓLp3)2-c^_?Рo~jgdVjj oҽJve&pD(9.|ldfma!!MOΩVdkFHLf( id#amElγA*[?18t~yw>ʉğm mw2uQW(ԲrkcB p[u x"(X &2!x,uQ%1d/ivjJ | P'$o8Adɱ:Fٝ 瑻8ǴarNQ'2sc[ITp3dƢt)dU:H'zMҢη(.lʺ>tjM<"G 0%f&f5ٓ,0 zxkO櫖0DQ&.+6\"/1 !6"KIEHf$/ˢ#y1k_qw.m~#e6ILk,EПiٔnaJl:l'd,PJT;4C_xkg˅zUm^UCTB ' RmN BlM KUp"UtZyZLK&@@бttP4o}`Xt;qI"B"HL cL92Q$6BJQ8e`q.+$l6fhdsgyCuW }}ҽ/Jp9LkaЭJ>/}>!uLF$8Kt8k A,]dT<\\ : m[GC_;i/a}N_sp1(!&Fkt",(K*!tT*sȜzӁ.8ׂޯ>AQ貫9>,=,[ɵm#4|^-QxqR5~fvI@۩Aߓ1>aqަ^?$'A/C3bD2u3ur2e.> 77 e9*mrUB 73a[fcٹW -UBcKiqyU[Q]- JVK.AqdS7Ya<=Aލؤ=T?\Ot6y?L$9E9heQ~earCP<.J{S{m A}":ռ@H8:Ak=tZ{}^7kM:x.U+msGҁ_2bu#AԁH5K%#/Ǘ^ Qx򂨮1͕' GTPFЫȳnC U[Bc4Qb{9!><^K욡[]T| I߉v3h:{HTPr+o>.K+sH9A!PmL[" !db89FǹvsRy<6QKOL#]j8sNl2}'6>[Q=dn unYٰϮOڵ|43&]+gumudq7!{7!Ԅ2Ȁ]H+Bdiv &-s< @Wr nϝ\[7$^2h맆W@c? 
`= ǫ/WLÛU$dH|5f -rIqk5Sn(oƛLí޹$K,B%K(xp!!Qg.CѰÌi'~:$k*9, ℔AIc(* Y͜k%B%:e茧bdr2y'-cZ,HJ& 3HgX9֌+ghb dRiIu ERG}@dN!Y#a鵐e,,GZ_Z`T1ĒSZD !$fjЉK&fHakkRd@Gڢ$ǘTs@U XΆ0*2!;1H 3|e``V_n(ŏrs(1pۣLINjC2 rA$DJD!dòlzs'<Dx8D\f’MLF'"D\^s̠J>D3f:3ϲFHKʀȭ2AhDV 8&u"%'VGc(wc>l8䳖~ج%}JţAqR0JL2k]RjĂ^TjcC9#AdSg+\_= ?&JP?BN0Ki<0nL$!ϐg AeM,9rBhCVYo',~;4c|!BBJ.9aAIJ lfuFǶu~?FM32IJ46 Il`4]U%f]Cs[wb}M  uF"Oe>1##6zA8:$>8eO<'@I6d8 /1\h 46f ;c!R6r)}~mi=rE QBMz!$yIӨcx$7j5rpL*9RӅ:M)vGm3Pݤא:Ea4 c@樌FEʫl,cCA#X$Q8(pã pd1b%s*- M)Ȅ5: K읍-] n{ߟⷧT?/~[WNsiP{sgx1sxղxt2$Q>Mܒ1P2LpV"S683ZJ.7 ^ի{V3(+GaȜ@|g,.f&dߵg xfv4 ӪF!'x]M}+HVD3zתV[]`(㪤G=E]jDںͫ_ BpJ;ٺC{?Յَ7dq? y4<;_έt7s !Ry1o ~Ǜ5MľCw`wO{Q*';&#&OZE3Xfhxtp~)f*_r_UrnzmJ>7\H|{2kk,_*>dmWVp^ q#qf|]o9WevqϢ]֓ 0&3fXYJr[Oے,Xq@VSd]E~U,~rOfg"M,WsnTnJEc4[*̫4UMqgd$W'?~U~~{}՛Sxzϓ7_Zi _t>X_$fF:MM 7moߣ]`mvyK?^oتp) Cll$亏G&'GoO$ bYV@Γ(9<[[jvD`tOj=ఘ\2p੯(dvξ"dsqHʎLj9J^E(ѓt:Tkk:LR_ﳻɆLx<~,󑮇 ʤ؟UfqFEZ~Qvku?N6Z@LҢo7b ୍a:Zl:~+TlBˬ/FF>U%>a5reH+BL 9ߦAnfA~۫߇-Ⱦ)[6@<%X>xַz(y/.G`0ncߖfm ڰ1.z"e#=ç[S:QM"8kT9ա\ʼnRHT:FHp9F7&!NdgEBNL+&3aYRd)LE[ƙ\JZe =M)z ༷ eYd9YrJѸi!G(q^7$F'>}p\ą1Ux!#u0e|[lQ&kytѧE'3ê96E_j8JWO//ލ]L Ӊ3W21Y' CZn|ኜbs?WI`Ə&m5nη~LS XOr̒6м%!UG'i`L~Iv$V5{5I~t3Lh/hp=G(eTƣV> 5(^Z&0A/QFj5M6}xIÁw]@;t]@;|}4eq;h"k 50p0{M0 7NXT&$e\袤\9eE(6 yEͪݐ7NjM;RC9y ,c*xX4^{1SRD峗ۡG`dA{cb ! 2uPcĈT0gl;ܚ8w54`B􊉜\f>ґQsTg Lyd4!XXmn8hCdAs%!)̉C %)''* DF.OM.x.F\ؚEz•Pq3]`isiB[Ƕz'CX@ѵVԨ#C@fnO c%vOY >Q::H4SJ%cK%ֆ"#V:0-޾irsM $+W^WOR\QYA$"% BӅ 8{l" %HXH ,s,cʦ,&T3.?wvPhf&##Y҂Ȭ lNCʊNg9[H؎e' \[ PhxrLX&t<+$'SOI0gEv֚8{ы$PTz|(%CK U?v2VF("m iI>Q1O? 9,3@l{FzdŭU:z ]<p J$R%RYuўy,!XGkr_-[j&"˃π)%e%LiDyf";ISu?=B⌌~~sׯo^oO_zs OOyrKZ 2MqlAh_'ܡiVޢi-mz>{ .oi[c3 aDN5a;IQIf+%ȀX(!$J<)2Zk-Qn-؄{[H2W^PKLZdE(u#Q=I3Nk)56^#fE%E0Ǜg\\?..@۹0OUlWy{kc b6FC˯ ua NM ^U-̵u`kg,SSAOco^<M@ʳ\`4t $5P.XuV <8m%Osn~I{![_? 
-V8.d8hqy٦tC(@8%3/3<AweA)[c@:)1J2#|@ސ-0+2$rddJplĭs/Q|xꞁ1XÔՎ ݒ0u.`c.&_v;<ի>ˌa/]wt,kz?&ƆZw܂*SI6*6\ ]TU2xnp\N݃{־wDE X DF9h<J(@"9,ۨU`;'hM dᓑQ3"2hA[v}1!=6ikU}4hq?0VtCփ[= :RH#ZGGq85/Jm[@`ٻ6n-  $ޱ:|dnz{w do ni%U#u{8zQh[q&$y9!?~<jKFCk@zV^H)Pv߯}I)<(´O (Df !1J5=ĤP9Xύ$g%rP}ψQ\ʾJ*|0;,&@'B3 :xҜsrq_ﳯچ뱃Oα$ N,D(އ#dk$<83u~`X'a;Ru2!hp@`NȝQ8ҥd \RR$T*nKEdQBbh18A :gDB0*QH} y bbd+d מpd9>Ό{&OIxP?dQq?]$}*kQ&Q# !i_(!Gp(ƁkDE/yPJ;dT8kHK?\U^bd'MV~ϟ6&.QJۍaM.;vEmZi=ݚ>qΜF ^p$QGJ\"j"Ht$Lܲn[eLN!#3hF!ք̉EKc̱u!A llĹ[~᪐q_XYaDd="5S*Díԫh'B,&GH! UNH$j#-FSBGgT%X8Pm"(Q{4[EgΗXL8vM=\Y슋0.{\ܚMMԃ"E";;p[q3@ 2R' }bcW<P솇a&ps]e3gy|{ :̺\oH֏s}ݛK9h~D"M.ֵ׎3[KJ_@ >~dȴay`I ;k T A֔&E78H9#RXVHX艸 U̙4gMv'<&ďQ! u{a" }=n'}r[zz1>'n;m?r~Z0vzc(7ӗ cI i<&q8qʱJc 5p/ G=0pI( $@IȌ;IxAqd#_H+6t=Bc5.G̯}L>Ts_Z۫uHg  <g[yyV)("y)s )#`L꣤OMǡNBKDd,'!.=uۉO qM&QKEH<`~nK?7##?ꝥ&&M4URE4BsJ2A!D5e܅~~w Mj1f?JLnHQi.  BK%@<yU8}0A'YFr BTADqLp#((GIF?wq)6\o]oKw ՛׭QܪUɞ/_·Asq+ىwWtex qk.o|7Ϣ//|e?NFӑ 2} Gog_Nuss0 gY{ӕJ奟+?aˍw}O+sqY%%f-&ӗB5Dzűχ'(vr.N{?jXkš lvr=Ѭ3gl4n|vh!yX6Ǐ{aۥy r[l%^ f"j`> ΟY;6ׁ-ٶAR*'n9+ݍ#t'hF~]`͖ϩorv͚9,xKVr[B,:MeχT6HH|lywL|D$tVnV״~w7 {2PoNpzR%HSY歋yu[V,.8E6oFkh| NZ~64s mN>ɱyȀ[A y|"u-z3-ڱذ20_k=4hyۻ-ʰLטfM,fKVrˍU|r)(jү *9%#:"5*%3Ns=xumo"{OGu^D*7%eycnꊢݱ'^؏mejt*wꦶl8`%lMLk^ K\c5Iϻ/gĖZm}o :SBƔ;)j*Q .ɰy|Z"y> ENdWG*R-<1gĀ.*?+Ҧ=Jc\ωR;J *fc`t|D3Ac:% "H x TSM77*iAhy{8lH (b5Y TqXM\(/pN]+:ͫ[krtٻex:] ;7Yt3zoҼNĻ##tI"!/ ^>fh|qbrl٥f]^7ow{HTsiqwf1v `hXox>RIU;M2勒ʶ=꽋>X*bcR~1 5F愶 rפQtR2LF zj-МޖK-԰uHkI,b>oYO`OFi3_b[_}ɹn)fٺ^|$R5nN}7ֿ%{Md-jd5Se\B*dB )#&(T QZ,$fTl@Y"b42^8.PS| ȂQUVXiȍ涔8w ӳ=Ϧcȑgf6AOcn6:#Uɤ"r5=_#8c4%ImR 2N]`s%jy7?'_2 oGD{v[(%ܭOuh7;Nˑ#e@DH)䔠SN.$xf Fؐ%B߷!S.e͋x 7u2µ, cJO<<SNi)d T"? 
ɘ~{ZOJ:.*X }.+]ti΄,tu; WF`:xP6̚Z&STz$%me$WwM"NIi=Kx$4Z#4l2 j0p^JF [ iVP*{XE~H3%)Ĉr؜h!$10Zڤ8B\3(9g̱GG-ǤSP ԆI'&0#2pLlqO?渪8 cW.qv~2?oƋjƣɴz"<VaT5ͿY'o=vg/g>˒<{V={8G$ǣ\Y: [Ϲѫ+a'ȍFXongevU6:g`4띮OҫW?.6c[ @UŮz ۠mqfŹ}l*QMrp`eB7Nji՜ƋKުbrC3R9w4A8 >c:TدdԹ0;|Y|{L6nlCafo`V?-(vSqUdf̎vKO,8-N!$*i\4STsoeA(N>#KbM}pa,vl3HY QM["WlEhR-ۀmݬ^` V(:@][/Jb.f sH@6 ~)\,gI^_V+.iI5mn7W|6)\ք/'ŝ`5P7?S"o=ЪŬ$*Խգ)xԋ؀#\z([9Lzy MŰRsӓ VjhcKZ*׽ħ 3=1 f*)Xp\%9oS [ -_{^ AlHNöZEܰu~{+G 7.-nki]3`hYԂE0>yrI"v 3A9j# 9 QEAT;AX^Zӱձ xWx׵x,&"O#ućI$JJ wV!LMONR!+,m'gYl!YGH(򹖜NAfx*ӨXVRh)ka :Y1F: lp6HF'g)Oc.*5X@UNvd+rj\h-BmJ;H"9%dYb68K@"QG }Y3ädP?LEuqX|q͊/^`~1Fb0Ћw0"cPVE649N_aE5ƩMˏ Yi#+-*Un>;jVLpZ*x}(#[,hhsW_.|&CuنlR&oʘt|UnhnӁDL&TYԁد.'n'AӥRn7K۴|m<>*[# 61̘^ɔ? wnt~> 6>+sI&_[Az]\p464Lgi{Y&8Vjltz+}*wA m] WOfp?qvg#1Yل<;lž͡C5ۘ{->MLMg8]ޅofoTG\9wa|%Sμ "ML}M{ٯuV~9vl>undp]0?e:xׄD9 /R= "FH =3"nI-tSɝqbM9ᓺ#]h: >]bbD׸DfK 9 LۅgT׈R?߲"3k {,neK{2՝>(XC ̊۱Ɋ|dThhP^"KT'i%f{)GƏ%Tj։# !+.^bL5 5 ] =XJE) Kd'M;(()>P8rµ[լjVw5]fuWYdkVsg= VZvː:At0%a %Lrv6ZP s@oR v"[cI"kC@@q7`@GvvitJ)Bd)*RhF= x"0 ¾A)Pʡ"VFudXD.t@)X+\$JKwA[SVˆ;:m㐶DEoГ'mXjuH!d`8Xp;oB! 
2qV{tHb#EJ^)IBL 1m'l*8E;j?-K驢nho(|O[=ըBfQQQs>evZ|*>~lT{xD* _s, [:ٰ@/ bTMemOп3>T_bf(Z@,S&rΜ  Qp+l JFt4 $@ )FCxd!R,t,LD b FрG!eLD``k ,bXq _(8ӽA_BVOǓ7] 6vM3tsssQz >ᆪzz§eB_`ʸC\1#&:%:RE k@`7zf`j0#4-ֺq k6kG?rB=$ c?|w0`uSI ëWWވX> ɕDJ.?\ #u(&ʜ!r+ʥh#6t3;,D3@>wGGfVa2 6xļs$f|$n].>M=.~q={aαnk7 $ T@),yizE;CTEv;ˀ=?$^֝Sy[ju6ދL\@m\ [w~hfل;@9tnm[Xd&0 n%ŸAcn 93漌Ļ.ql ̏'1qP_ Ź NGdƑ@Cg0ƁEiH[/Ba-9IjwoFI* xa_Im Q!@?8v0)&8 n{N[iɶu"Ha0 _E:+96zkMz<l4 be9iyÏ78wea 7|n:ek) Nƅ;@_$L-qH=0D_3.dr-^g*/eյ2L38 }oD'MQOS+ZW)2 ?AoЅM"G/βB3:L]ܾ1Wo4hrF;X}mV}uۺ/[3_";- MSY+0n v" o!{P3i8gW Hy/|~p@.`,'LTtS<·踿1wEjr]) bFRr?@.QNa&(GP>x$d#ȱ"1hj=ˋVKu::6`-t+; m-H]"aqlV_[Jeq33q9fc,Xgm| 5UcIؗ#m^ij Ƒ#翢o *e~9_p@6=ȫM/֌(k$[tO]U|b)_{IQ2Ps)#Rfi h hl ۝zpctx&Νh[&M1:UޙvmQp9۠t6 α"Db~0m;_Plc㽅 'Qe@FXHlt`k@2qy-:{JSm?O)ƒ&q`\I-IU6cy%@YhSXQ_/|?Ai)Uocd!5 @g5EUvdj4tt0LFYshcu)-tR^+mXP {uVT`S*]h&о.@\hٴm$oOuI1Nڤ-0$,|j8O 9/^IXkoݺ6hAEwi6c\sH:f8EV%+v牯Wof)hh C 47=YfqkԎSN3Y+>,]~i{Z}<俋0_G~^xu6Epv0\-ɴMX- iyg__e"y}Hhx0f0a.*1ӓlVїDǜO;nmըݣorۨZ ^H؊tl-EdY⺋ \t'yqx{Wlfui n!s_, -8ܲ +2G'qWλ6U'[L1?o?}?{ӛwлݛ_Zy.sέD60"OyRQ!U$0s%j*%:Vk.)dXȓua\> @2kLIlL)@YHde'N8!N@R*Ile|^UϿ d\UcӤ5^Y~x_{K0=xZ_ B,u0*XS jxY6z];sh:DZgFJG]!G#Bq !Q\ 6:7&Jc> طqt&^Xh/Nf!-/q{v$?9pʕG}@F#YG7I,Y(4R5iS*TΕ_Ŷą\3Gfg!Et`. 
5cJA#6U|P5Q`ɸѸntƱ@1a%/Q\HճV!5cрbޓ u*_!ӷ+˷_/<4|UYO>Ndiifgi^/P> ROmq:ljyE;y1:VK*X JWݲN#ɧKr>Hcl~0Nf`5-˖c/6G!:5 + ,Vha|o(}Vy~ |Y_<@6u'e& -&c!d-B_MG+rJ@t[9YUʝ/ ;_N/_SM mt~ex[d5 ~xl탯㷸Fot2q+7ݪ[:n\-/}Nح)ڬZ,>Fkv_]V1o?\UAٻ//9p97J9 (5_]wt勵wCkNz룻6:/vn}.yQjo}mIv8?[6 cSo[vd/aHEGh&*tz@z@,f Cf:M,ڍ Ca{T܏ ̬CGd}f.(Ӓ,:46ВBIyaULvر0ƛ iJNk*`sK("x"K!Q"M5iGQe G4Z<"sM]F* QٗTwL8w5q~K$r*,5)cƈ1T- ;q Td Vt_ BEL5_ٛd63-眡vgȈ tk8$e\ PC\Qk^+}kfZK&(+QNR9'k9FP*%\ !L(m1;J+}ЛۑZ{CW#Ro>YL7ث87cCBd}UHΩXг1:I ENaf`!g"]+[WZbL)he+!*qvuA̳gHJl0,!bdbxPQDl& DZQ;W\o(r7ŎF J"&l^&bٓsA8+r_3[%ءrx_"P#KslqJE,Tn .!x&jwt%8&>QO?FyjQ*}'XW`3PQ`nZmr`ZV'=-~7[8 ޷źj,F(?E6>Ě٫,A1)Qֶtqq{UN\!E|eZ xg)R m+ ),SD| @ה1S_DDdk5yuSu6uf-̃<5!XW+O` lEʢ!ɹͣylf>XG7 u h*_m߿) S67_@h (넄1䉒MR,\Mt6$S݇(FhU4TRɅ\2Lg&kmu^/d-<JP!ZP޵pc TkӮe^iw-.{o¹ރRTxH@6Zd| -=8&YԵ`2熄cHɶfȵvF%5oN,̶VDBufLb$ހ".Y2^QlĹ-}T`)^>䴛܋^F%'W]=e~œ^W9`-rRܪEF9+&*sFV1ZQI0RYzFGRSRboPΔRC2%rZ1z. Qz{nanfZZ5Gj2~}yef_fy~4>xttl=6շ^,ŜIZ##N[5Vو}hs!}?4&+^,FAJLs|voBUP j檲87{t2-(x^q_;{m=zL&`Ȩ2NN jkb].*N%gBI.uuYj-x598r2b+STR>~MԷ1GG4G=!c}%C פLJ&(#tt"iGlBu^Chj3Ih#!;vS¤ z^c6)uM[:ٺ~>:i}"t0/,ZA,ŘL z[27V$'. P(aT/?ul,^b'~rfUqbMNJY臈F3ih}_hNM/ƇmH9~^)9h;;:r̴=o/r?{%K\ b:"IWHs"CN3[p]9rS4G#O,NZT?69" 6buw qΦG FZ+!TSĪmrSy퐄m k^`r Z@\ zJdX1$0FzHiE.Lc687uN{ܬm)ONӖO:nMwl0^,6-#w}w?w]?4ON~0XJoyv e\UZñCbCrh,{f${1oN|gٙ Y *`* !h_|2kL% TBwN1 [ S+uS)*P2J޵F$ۿ|YBW0{-ˠa]!OK݆a}OVW?l\nc c ]]qDK'@R>0Ĺ|} Hw$m6oTcXPB;Z^7})!L")As5Q=J,(#d`jgoHPզḰ,r{3 k0؞C?bO%᷻+79nkr|GG3"%\cT.&ݝZ{f8OÇf"`)c3;g)YFr;ĵFnMLoҷtlw_ \@gŇ}rֽWHۋEQY13h%YO&)` K5Kə|j7UϮn,+69+Ƥ\$D>1 YCIJ [ r]"mDQH[B !ӓWk7Mu<ٯ;i-vŏq=nӽ௶8ꖁ'J"7VYcbLS)V%< n3SJx̚\u03ߤBy\,uɴO .˽@÷Giyl]=GkyW_/lo珿qχ.w[؎V?9A F/|ߖE~ˇ:Gj'f53Ddg{^x4X=n'Z=cy?Bt)NO>rK#s[h v48ۮlW 蝹=W.+;[ҼژX_oݣܛ3!er^ٓJ*9u7}:z3H?`ny :fRʈlԽg]No]G ί|?\fr䏽~drR7vA0!s|$JZT=ijPsʼnz|$I?f#ɻ7i!|s{|OW꩞Pчe˱8KpJ!TL X;ށ W50fm8y0ݸ Λۆ8:˳^dj:2게0pX-<Wg/O 'XJy? 
߾cD'|=qo}_<}r=i K\{-)y^GdIHV2DvֺS?.Xx|~LP?2P0d"!}nW$]8O碑Wrǽ#-6,G7V=_\j%;//Xvhq=3?໇~zj;ZQrgv_g>AG&'Q <ut}z+Jd~0.Y?Ja EEǓ 9^!(nA\RHOez8sGr"^2 o2O^tj`غxedUi GG S:5ᬨ>OCHx )"$K3h*;A^]KM[94l2hO;X\j]XJ$V_G{Sn7r2h2$54nOi _RDZ=+uz2c՝"*ړ!MIFQv2/ONs3v˪d&s* E暠Qd{^|mAjNgb2G҈/ێ-beܷ fܻQ'ۊȏǓ5=>Y笸el=%A~f 7ESk' b4z߶}vq7[).{V̋O;OJS]]~;;65[%Fڌ;ⲧ wA9+իP9qBᷩ8i͋Šnz-9ހ#YeWa .{Aʖ̵Ƀup% yԪh빎K-kۍ??={aVvfp9uXK^yd񢓣lK0 +Y{$q#Jm Ɖq`8(%E)CIQw D˰}w56qi >+-9˙\[",rsB/ɐ":q)\} IT&+-J$}"aOJQ* 6Nb(*CddRِS>3VMB$ݟK"$`" .cCP眥FG"l]{M2^[4CɑV_ DSe Z !qk*Fbc6` BAW),2U"ֆѰ*%}*Ś]1F /ͥ;4@l!–V{(]=|#QIԴD hyJ!:KgRL$+Va9"s1Jqp%fE:ˊxhZwvH;@CV: .zS*h`:\yLBc5O0*!t^hMr Tx'"LD">I6 S= hXz A0ޠ𠬖{)-حx#(ng6 &E"32YЩ /XF| S,0mF4b$e9Aw4_XQ;FBɘWƆmXWS4 1w`T49Q1F܅Xe\YTA\JP+^ ,C>P6:="Z2zV* j?`P7*`k6%P!U4) KY(X(9hYܔ@ 1.XQwDPA&e#SPqVs o#:LYd5Qtta̚۸WXzJny$싪&\b\X-FKR^?pUIIYLcht\@~pJ!Q %Y9n)EF 3ax#>zP($B É`#A}A0i̐fu8!jQ0 Wo%6`JE"SY"I`(BOR/\K+`k5MF-!hPAa鹄 ZV;%r% ݝC@* (-1'0rVz5JJ0,5VAU|r2AVYX &bژ891pZ F$0ZenzݿSܜ~ɇ'-rzsޮ&] k}*V$EA7B^mL -{㳷9u- ashح''+dР9:3)d/LIKdI=%hA׃ʒU^ׯ@Gi52ޗxNSS{'Ңr6E}xx22ISQ(IDcr(Pr FExPtQc4 h0MZIY.l,uY(I.HQ Lm"DZ+i}>YGb6>-yFɇ]7x_nnoǭ><h#ə;mxqV뒢[O122&)ӄhIcxYDӳb!"1qzJ%eYN$>b*9qƐd&-O+.޿o?+ [Q!jM)|d!w)iRN1Ti4™P}^Z_/]}v[ ǡ8C)j;Bw% #^:pS-j4if3A3W,wSx>ua ,Ruܜ zɡ]=?ĔUo|m FǿXd7/^kY}0`<#ӟc^]1߿ϴܹ9 ߟwqzf]]\Ww_u~Np?ROG秋_!@|Z3&:?eoWcp :ϻʹțtqY}%F'F?n ~Kkm_.s߯kp.t?/_nI1XOЊT w3A75NtM9_|aM/]ַQnMqpr W 27E}R\8Ac̈́5%1\P6x !v_u{i\{bt9njՠ@ݵ_%wJN%6)j9^ά|Zo mw?x)q}'<~|&'D˽rRf(^/Ɩmߩ;Z k5Z[ͼ뎯77Kpjԍl{4DY{MIZ,0%TԔ`p=Lk^\ȳ k sJo;6zT.&V,|`B[nԔڬ3w͠Lܮ6w,ch&fTߢ=y̓[;N,L"!ژ5Y -"9d3upinҜr 7,!"$BZeo<@!Ĭ}I)M)4z!Fi\Ox wFoM7M-ѯmاK}S~~xv~۪37ZAlBڱ bVR7cߞOB (l|,SwHxo:{i୍c"h%q11_MU.9@3fk9K 1ڠ gC2~Sw:׿ /sݛͿ gȾlUꆞ=ޒԪʼ( rMɝT?ZvfL\ICmHh0^Kgv(E\*!R껗lཤa%z}yւpQU̘ܔ;*WlBdsAsei5 P)gx,e.<I|9}.%eKºI묭͂,2e Sk yJ۞%kMͳd7W@ڊ7{K}iMZ>Dzᤓ]pooh:V*N,ZCbr䵲&.CtR+e giF霩G#T[k49ŴR㾂ɦɦVIuY&?bϫk$W'{$WiCJ3\1K8y/6t4491l8O^p`\ޜgt{ՂRQXqEfJWސ1'E,!b>lw:di. 
&qYJB6A9[VRC{ƹ2iBЁ c6&WrJIֺRbUo.Bɩ lQ@ .d)VB &, 4QC$2zO3,JRe.qL28lz00֌ɠNq-܃Y_A?㼂|vy޿"5qOcG)s_#+%ȍyI}sX`яjkTDʶvqUHJmSD3ǯkƓޤJ KEk kE۲Nv z{rB]u!LF$1{v ]$) ^$RFlQG b}W%( e'ei&ɐ5v`m Q0j.*Aݬ%h9)/rNpz?zxk-x[QO cO޽Oc.*է^' q׈0lTDKGmtψ1XdЛ1Cx]Fx#"Q؋TxA%Jx}%J=$_}g>&Խwֹ ť626ХkZT[WBmhR.qZd]݉?(a;* %{R)>$1{DoɳK&z6aА" }}@Cvhfd)J&b:LQqVe}gENMWi@jg+Z /'5mae(1T Bz(7[]F5Gq04(jk4yԌOY\wʈyxzdAZ-?mQΰYUFYlZc|4=,M O_/Xg:m#hgv|=4FCrPlX2U!KUvm"<gjǚ׃4nLM0 Cѻ):b܌ ߑQzӌлnKݪфɈ Dpy6]0D ryt߄`Or;\ (,.a]o U槭v9b)S[8Wz͡Vnv vބ\aY3NM.2<+,ʪp> u1CHlbX yXQ]3:&ےĪK٣,E6=ȹmK+\f7G4;R%uqHc0J6dœp(U]  VD"X1I(!I)PkXHY_ϊr(6Kcݢr1F;b0~m"~֨0h6@t̂s7 +AL>kL69ּf=]ϐj[i޽ m-vo#7h#zmH\Ӷ!AZk5)V/0ݧk~L]6| -_Ƕa"A 3 /<( *DȘĞ]}@Bqߧ}wp[.|VIBNbw*,4N(, T0ɡs$-Zڐ5nlhVd<{@&A%|uHƈ@N*8Ճ Py?.5 iZlG4|ja)1YYJS+cMReL|+sА%R $gpw/_yQ1DslHM@LCӦ.H)gmI_AzH:JXj@)S҆P\D%3yY#D(4t)7nfui;4q%b5ūrX4iF£TjV7e!'-R }VcEP2d4i _1%O k77nMQRH-TfRi6cWpjs* IhYx c^Yx?j\Jr9jU8 9/Ħ@l3mFmqUfyɂi8t`]jGӫBs-glU/rӬUKYǵeNl}M 5R:?H qOtfR !.4Yyhtqi6:Zn˥:^`8=ܤ oyo?W/ū?}϶˄R-Q&G#5/~}_S|5Fقc^) ĭG` 0 Sg6$.$$x&W'dX1y9 eciFmDI:0W.m'=ŠÎW}GmJQd,> (NdffaQY$-YV:Txݝ/#gyыj4;lg|f+=ˏ+gzM}FGO뱈Vk[еri@ 11~_:jx4x]CF1 e>o$dEGFuf'h۹!BܷjU5[7qzzs}W j5vi!j)>g}=)C=1-Bm6N;1F Z bN1n; Q-LE\"g$h9QCR3"u2PyGzFN8d>DSp3iAxS}p$SPZCmы>B&!cvu[owh~OA yyyt>`SKӌֺ㋣EZN+TQ Utmw맗/BW| Bp$ZEhM񢭼ےȚB5:E@[;8q3[iϬf bUS1\Or:ķnf7w[Wp&NǬx<O;5'7n95[ geTYvvخ@ZN cY=1\v2u6 Jnٙ) ;ް9ơ{@jjwh46jqgMgx,Bks;x}V Y}aϡ0&3BfVi`}"EIY8lX0(<{Edt WZ0B ӟ":)(ѦȒ5{ @ʃ.b"5P3:XE@&6F$u BH%{EE@*5h g&ccʽTz7%P("&d QH-.J%depJ'{a²%q_&b225сlJ(dNfȸNBdÿ^_io\m3- b]{ y`3@ܶ>0D9ẕm@ZG(Tkne%&ZصFN ur$1JZ A ZJ`l!9r-#}ϡ7ہH6.Xg\Hk @"RhM4y- 7ꋜWա;iً;ePf^%,]GxAMALQ0KDFjSPՂ%zĎA)%ʙa6&Y킘_H&eP f!cŰAh:8I2۹drgEc J^iPHfd`jb {ڥzu\$-7r)g;W"b6/KQD)MOR ɹT ̈^[2Ԃ;B½)ow# |QJ^, h"*jVZJ9ZA49%{?inlX'x_k [@νgP2}Tx#ua]^I=&>7ko?ǹ+i]ـIB,tjqADGD؆2шNiCZ[Շ8';jWOY֞Usl? 5s72*Ӱpg숅BoXXxٻ_?eN??aN~?:<=]m=9FUźҬHVqyo|fߢFZ%}OCɢ*őI!jFjvFRwеuQsFrr\HgڳcWԶQQ{`o]8܉\pR6dK0Qc4oV+M+f,"V_R+SSo 2T4\] m2n(SBTgrfxǮ&##oo4֖(f5vrr ,ޱkR0% W&If"bkL mMҲG;+=~rҡCieɺx<Hk0~yC! 
DE rN.tRp:< 5+ɇo[4y՟vynqTΗ/E(d<ÏkRr3kr!Pu/"s)w('K $RgH3_;WFh$LJt sٗq ^q[DN5".O5ZmCn5GmZD*毷j|rd\SkJ6jc| 6GR~z|쨵B (d֗&2S)FK\) $ʱu~5X\`G >qznU6̹;/Xѻӱk꾂`qSmKܯtG[N>~,;Cw-Jf9?^e]F"u$gk͠2,_~|aᥰF7]xOo:/J(=o .f'E݃[6_Txk'"5 A•䛐e+2>B~~[r&ZTåJ2m";|%T%AjsH9lW"Ϯ$BܸҟY,U2nz>#n%Hr[=;"#n~iz|r5 : 4yW4/u^؅Wo|֘HS~c4j4NGoIsk&e |9kb(.$lR%{|VJUӽ/#+>l&Jb|vYЋkAnao7sazn}N8L>s&Ŕ{RL{b;;<Ē x'?.FSAiqv +8u:c6q5z3ba>-^Ճ&!>r!-HLiha~uX׋ut?j3l?]xY.V:J6dXo+<;pp7߾Okѵ6m ,T0d4lrV j˃߭?eZO#sny|z/-`#;ޏ_J{u2],/fS߿-hդE%-ʮ&QזpggC39#h?=q\ rF-A܂Eñ}K [|-^,1u 1)׾ er j_9jgoMOu n{?Wb‹߿_=HHmAq_Q2|1Qݲ=\GYZ чYT~;;[#Q0"jPk\YߝEf=a =z0fԥfgV[ORnO|-l-⳯oD^E>Z^:Yãb }Lxr{@*l}wBlL]);LExsY,oh2-NNWgc"Wَ]qlSf=<OrDo{Kt-Mí\A\!v5uz9ኛi0nȪ;l7ϨEq w-q<rS.܈?O5_gc0_zW1rYkZv,]49/ TJ@4[(X 6ͣd?7ߘOP9|_{G.۫S$o{߭\.9<폍٪1P7%n\B`w6OZr˶$>+õ󥖘\Ʌ4HTEڬ46m;s)i%7;jGSC]ʶζ魹nǍ(LkgvZ?Ց͛wǠT|wG!vo5d멀I9f97czJIsmL#1ݹ`kH*zeul+iM5q&̉ݧV4T1T=RspUxqR: cܕSw7! k3ǀ6>C SL^ 0^7LAM^ngoM6%u -lX//@TR ' uD[;Jm4tfͭQ1iH%mhNIL ד-L$3PShZNgikGZGwH}"2۬MgkBQRJjg*#CrR| FL"l45/*1U!;Mh&HSIAL@K "rjuALYklwH!n,hf!{ ٥ۑu&G˂OhZ%fx3X[ ݫJ 6p Ak!NBǥnmg,u2rQ,C ut%&公vqf.یE (T4QFcHJs}vi7;vIilG?`xwuT@ܼe %B&f+ mi-?7kI@ 2zj v%> %MKE 0n`Ɛ)uk03\Fc05M#\5G4̳p 1C% z97ud&Ja C BРB;7)p0GRFj,T \!Ԟp( D}۴<. FҤ$3]cPH/!Ej"z8$fdYr! H ňH֢P4y;Δ} t9/W֭4դ1c9!98PZgQD8Ԡ9!@ .wgEgEP?n+\tj3Xp. H[oP\ڎ&O`s:Y,DP:͡v@:Te ÃTH Hv9!n7G/j,$t_ * <@( V) !KqfdX80{ċd A9ˠb3 tt<vdn.f,T]dAX?3 \ŵ_EyEܶAMVU.kpߩƺcCm]z~͟N'ljaVz`Dߘb=2!߁]ژK`)^L9ІHT&j ]t g`녺51 lc٩Y[M1g(Z.3c[ -HWP!Qm mT3LjJ'x{a̓ 0 0@dLʀGf̱Sn,L 5cMhQbv PDm#+,~6p>ޭTe78ޭsJ5 l9)ǹTacDv"@g3 !"fSҚ8c @9+ R8mŅ鲮47?Gq3 O㈆QpFA5@d]ti-!*V$Go J Q^bQ.aE@(8D_NFB%iS~b5+yyE8~vU HǴ$t1WQM֒q(h3>#Ald}0ူmfvė,pF-L,:c)Q0YlWb3pƨ<)O|?7Mf1[Qp vE <`U5\VΖXs< +4Tcr.!wAr,x= ``ޖ`%(=&S#Jkb0,6Vd_`WEEG] \rַ`k)ape<`,Eb DrQ!! 
oRQ@[?0^E׆W"H0|Ne`U8rK"{jRBuT^ Tu |Iб866+D{LOy4?S݃2uU3cYox&+)AAʆ@waV>P"mP0%Ph{VUD10e+-K>؋Aұx.Di >%(n0%h7Ƈ(ңƞǨX-2*WzE.56 d҃u,567ADRn(UjJyb@t+$;QA#VEWEWLvRO׹t:otqcں 0Ѣ跒EGM/08|_YoKp{͎3ð'gnN^@@0 " J{$PRHD=F$ @B$" H D!@B$" H D!@B$" H D!@B$" H D!@B$" H D!@B$" x@\PMp@\|ұ@55=Զ2zH !@B$" H D!@B$" H D!@B$" H D!@B$" H D!@B$" H D!@B$" x@`9B)`@@`G!1" Ӓ!@B$" H D!@B$" H D!@B$" H D!@B$" H D!@B$" H D!@B$" x@)%dHH X \2l TrHG2p) H D!@B$" H D!@B$" H D!@B$" H D!@B$" H D!@B$" H D!@zZϣIL5N\H߾!T&~N J@Ɵϯ#}$h&U9}vef2*#8~R<ˑM¯#p\֋z.EO'u8_ ·hrB -j*x]^6Vr' W~v)üx_|+עLEODAZLÊ*d1?$k^LQ 3`M%r=,J w&=TGBmܵN=Ll0$4DdJC&(02dqVRLR|VE%o/#r9M%ye*Q%bqZY-6(*7 `u#_T̼mv_|H0X=cnZct,37i&t< _?=f ~Ox1\EkAjr5{;Dx?-ږ\"Bi\jruOxk,}3㙆i8{b{4T.j2K.K0q쾅'ITQ-- {XEKP\^x~&MY.9XC)%ga W5-*ucwΊ~a 4._V|9dh\.+j z"i0ipaE80ŲZC+޾U_|"]4B/F| F&!V1!DzaRXzhsR+vp"zIo|߾TiJUs˗#UJ١@Lsj1R\V,DgHU \JcjZlDHj, &=-VoY\I‰&lZg}uA9\)w8^,Q+5@n*Q(-Ѡvp?^ZlZGZ>\,#'uo{_7o./ƼYj !ie;MXl, aNf Q?J׎ܣZwd( oY]̈mCbs̀fG>Hჭpm8+5Îm\h3 ׾PmG3{VWDL'U{2izE#ƪ'UN*5`S<|`Zޚ u48e~=u(̀ګ4|G^kTMo@r·C& h hbgHjaBQMNM-RvN;8|&גp0466-zw~z^$FR#W*>tu<&,nWc0߅7-J멫fƣ^N>\XtemcO7nxMaՅA)u$ebVz5Q]>uyq|C0H]b,n|𹏈{VǽY8A*TxFJZei>fͮb9Bm=1PϬGQf(Ń XkO0Gl|Sp~&'Xk]UPy"V+y2)[1u4/KGuٮu* W _#@^='2.~ϸj++3ߛV*ی q`Z2ν-}l{ Q,QUTׁMLr] k] PíF_͟F8(=d 4\YmN3m쪽gtqЌ~Ȳծ?RLڲց |;I(A"ڇQ}dwz.-ҩ}3D@m.]~|Y%=Xߒ^{YQ)y9.C]X5kdɦփi#P",v2  eqqw49L[@=ߟ%R9MNcJ3Ƈgː^7O#-vbX̵~y;!zl9En[{1BP;+cjJmYSYOKgqwcvJ"pc0Gw*mj73c  獫?#kR^-L5Q͞Q#hɒXMsښ%F",v0CgE{9;`.b`wČk׺y1'/9W2CY)C903(mvr,C Ax)C(3Muqlx"549_A7.c?7@+KL5(ObyN=:J6u`S k3 Fl^c%²Yb 2fX*>k; rR)YPvsO#3h利v0e]@4X%Jfm"Twew.+CZ鎺\gz+_fgy4v:F Q]N]GT]C.H'ht*Y>ɏc~]3ruGıC;Ngs7$:*v;M秮8E]щNGmvd*Ċ\ / JdO$AXr:.x[6-cb<0T! Y0D Ȑ!cP橐)2XsUH~[C3> K /n2jsA((a 8 tqn5TL )sU:IPXDwS9~k<)%Fti.4,io\mƿ?~3d:7Z}؟>-R?;WO_#HNb_,_ge ~kB{?*/1$x"[ٞCkYk>!*)}}7'.7jH~!<1ŗ?~48.B~?tiE_wI5{Q3Y܏y`HcOC C¤L;'흜( Q.}Z`dԱϦS5+k 9hi7a-n$̏S uì?}`:ShOdÿ9-1pU~9$͞G^dJ{<d7 (1HI! 
Xzf(rܽ@!b_7e!zRf֢g] ?]_rPFudymzlً[(y|+RNg%cCyLTxiz$\؅ @ǔBl^Wk 4w;mL 6zB6rꠋшg5kvK-w?R*Z~cBjEe{%{"C; Fjظ Nh(A\˕c.mj~qceYbjGw@*ayt:Gc_nraļl|~m}!VЇ+25îS׻vEŵ9xl]ZwXkޯ݂_z j_v>-5ǖT4,i3ZзB4tAf^ A;礝}@\3 ]l7@?EjԴ7!U7T@|d.x!ZUKsdžvٿqy $}y{ Hf5=LVF먱s%jF6,m 1 IT`v@Ч;b\P zînmX4G/x깵d\{q0*,y-M-yTHF4K- pO#'e~3fbՏ@hRe nG3%}4C6y'IV-X!rXlA UnT᫵H)B Xm!D?NBP=ߎU:@& 7e^5?,\e <>R/;2}vυ藀JK=B:-**z"76.Qrˏdn͆;3%/&$SKa7 ü'dϺy>w\[ǯ!_f3 d~ ح}t/Q~XxEi@3l dIJ jT+1hMl5$^iȗnN`$6+% a|/uKdlܭAL׾fK˚ɞùeoJwȹt*lT/eԸj=YK =xhQK^v+ .¨ ͈Ώiޯ==l<~"%x־knnL2,hEШ4J˅W,+еOV[x^-5H華߅3BAzZDzs cRk3_It9 HƘ"M c64ErDcܛ5qMaRrLXϊes7= &쯛>qíEyϼJ$ 1pUVTx#2ZLU߉)د]k'z+iY۹2؆=GȈ'2~gq * 1ǁ6ށXWLo[HXlҩ6l$Fmã_-HU`.~+.4Kqp&Ch]2tdeNŭEUI2j(= (xVĜ0&!Zz)F3CJ^ap+;LY®:b&^ ?*yqT@d44J̈E 7ΛdrYP(XuÜ^nԥpC[F.It1 R2Jp M !\MuXcv]~-v˼*;vʚ[8f$=6l@bPO{ݞu*tH{RɨZghe[ZN[A/ɩO׭/Du-x:%.dЊaw=}aްJ "A`)ۂyhŦt mT^أ T>Ӗ/"T.(T |2}΂{.K\lQbv:өat]ggyKOLwv( ,`1"hQ2H5oT(zeyXjj<=_ļZ0 ^\Actu吘/uԶ|VO;2EH n}Uc2K >E*J Cܷ0kQ([2XN/a62CVI?d@;)uXɾғ?;t3Bu2Jú2޳1'P {<'@r`@ȀĒT , s(wNe-kmU!QGpb.nq.rմc|QEHY^H.Uvt2A!&:/_~HY."tr{ORTy޷ѹ:Fw外j]0z=V̀uĥSD\Ha1Ib+2p5ئev ;2YKZ]s:hKDt |$lNhmY獭o\h};$3O@שu3{m +~ohYޝI(4=(NJ!u$F8yoN95.0&Xn~f.:l8TfOg9e3Ş"ۣdb(Q`Y24S\jNUʷ"I`0'lܤ SV <ٰ DfUv>fYL,w29!B$/wʠ,,LQBBɃlܸR2J5}*]D6KbKXDA~,mOi&6(>ẇ>00j~8$ gZ2QЌ Cl\y?#$6D%|6&:jXJ \zaFu$Ltd84,mqMϚM:MyU؝V$k ph/^6ioO*,b[KC.J6JcQḚ^ySʸ!(a#~ >JP2eMend"z ©ꗚf*)g]>{p"'rXAg W6y?ܞhy"TʌQ ڋD<lhp֍Z@SۉbFD&?~b%E;G>oٸ9/W=>>ۜ4fk, 0U/P$[xd=3$ *|qF͢?VB j|60qY,`vf3D;؎ί]Dٖ N O,+iLptcܤkCY_Y&De4#qd->zϗ $RDd}>=72Bjx}W?NGٗ|54%n>fpd38uބN_Yxkȑ66%ŗ=9a[c- G#ژL?^ۼI2nR`W8י_"Kt6o5F|dip ٸ]w'":ylW(CZ66yEH|k. 
m8jFa~dIq*h6Q `rj>F_.R+}l"m'â'O?B'PhI{ rS:7;-ptT36#40f&rY”JݹHaoJhu[U[؜+w댍&m~r* '>T/ΏuЭI7QY99D.TNLk]/"_$ >g%+FI"M@LU$Bke"2ԃx]ܓ#'랶 l0]C.mz_t`SR-4Khĭ4(IAqbe3t\OiC&DrExW1Ozrr#vll'(zwPvVȼ2y:)eVfg;XP.s!]H“Ŀ9ayo?K @$(Ol-3Tz0n#!V!'3xAJg31UנŪe{jAl# 8[,[dTp/v_1:IU\'BƮ<'+ |Tyϩ9i_+_QhQ-"m;KQo]EhΓvjAQQ5t[ R Mdk2&fyaH\`^|P{8yƁrI6fd˗_d"8-'BYq߮1;ʾh>ap(Ũ<G ڑݰpf B 2PKaa/xtϔ3Aɼ =I .l0 tmu+w`+~Mo~?%$%'ֻª@32` W:(%Qj4s*QX^|>A9b@ œs0Yj]ICnE;_"RgnQn9DxxljJ'xNn{kP2y;#}y R%fP1yj`XH1zoU :2n򸞜aNV>h2gX@'Ti kykH2ⲜHZn6n~bki)rp M`D_tK5Na ,,2d9K5kI̔Ҫf*0dUVO UxufB]/-뱬mu8Rs/GnPOQIӑ}?x , WȸbdehqIV?8 89%s"!l4CY},xVx{+`䌣>^)l2]ΦNUH[ϚdԲ6 h>+d*l8.o*'Ϯ ١aЈa]$1K*Y4r:.B6FyLp\*Y;寞7U B);$v*nJB+ 7]MĜ?ӐܐkBFC*&fBƭm`aU4ψɵ¨{Pƴ7$N&N\*ca!Ԟ:4|N2i$bkA~ zI镘 N=SQu!\bJ1/.(ddDQ$}*!Ԩs矀L-{[X$B*""9e+]sl\'TuЉ%|ue2=*T#H0 u+H'+o ŝ}M{M4X3hUV*PRμмeH0G#@©l:U}i/_d9K7\@feQP2K=ʿi~??"Hm#/*D=.ôDx#|?w y|;gIgf?%93~0ol?CWJL WN)'t^gmqR'ʘhJ&'ӭǝu}) !0NdjNݚTh6E'nWe"V&29rĉ3&.2)LrN_TDA\=w(t'͌a zg(: W IZ|d&^jtXa>E?ig>Ƅ,J*n2 (PKet+Dr`#Mq?6РnH@TRL?UppY?{M0g=GJZfeʮv SGßh7OkVbR <[ioxV˹AoHEɧG9#ʛ;UK{Ν4w7Ԝ[ =Ş޺syY5ݭE }kMދWB Q+샛_iwJ_ǿ~FؾȘX3ÞXr73S@)Lh̹D._+)E*p忡Qy@Dњ]a]4 BBgE;l m꣺^)*~MoY2O=YDM|~이 Lgnc. xȶi.`bV ̎YKRL e?G BQջofi0B&\IC:sLOs^8S?rKf@êvr\;VL@WhnT|/Y2=kCGٯ{]C0?^?B`܁e0N(FF*dQy j濶yܾkQLmMK/x@Zr^X~B;~kuf IPw@z6L2 \=Oa/;7x?LZxmJ)Eqw F>TbH |zeR y<)DFbo4$!{|?W*, tɓ1RQmvٜ{ Χb {Wȸr[)v9N>݂iGz\l;VEfE3.?;4I#Opư "u0,)G4O3+o ocȯR&Deɔ "je7CƊ9E wPQUG"ɢe:wRwp"UQzcedY&ƣ`vE~]:qvަ㈿XKB3ZFQSRsJUnG2v鑿Gw|ަMS镑C߿2 `'Imjl۹Vm?4`%Z*dbRA;:!d  NiY xêXtO-oǍs*|*B@U?(74f7pCǮ nV0b6w\`x DhC} P}8G >`݃o/pw2>^)x]2}UhljA~wH2A> Ab噐ʩW%me2#2:"j4{LD̯^TRtJkx,LWvhv0C fOSQ X+u_'.`.[oK+U/ 4RV;Xz%KG+|0i@vuĄ* :*F1rq\ò}'d8.%Q\yfбXy$?]UuhbP[.&Z*.0Ҏ~+D%*DVbϑ3#+u!\@-8%B51h'nRblcъzPg+kVC1(}k;Uh}/ǁ}P{:(6N^ Uֽ ӑ4ϝ-֯{lK&mn{ jcT 8⺗Ԁa`Fa>n_ hrCLۃ4CMc;[X GoMXE=?;6LJ-psoݯavl6XFx716Moτ]} 6.IlYIw[qΛocNEBk? fYua[<k ʙ5;3Z.eȦX!VF2+ո=locD~e/Z1!ZI[zÄcY ܯWb%Txr ٯ?yG )`c,xé2C9O/oI1Wq@'q+9b'[  ]{?{}PCrfH!x<ُ5FXh5.)%_'6*yYM((1fw{-F %2߲s2L ;4FZo$6+|W,fQP26{[,xm> [±򬨴Lb"9{XiqF1Q[%QĹߚXђ8֒tU%V5TReYQ1)@>c*." 
B&5#a'̔dŅ XQw$!'\kNi™1X0QdZ8'[uj>١<$"9E(JM$AH9|9**n9|wz×,rIғm]9FIޝ.3162noܚL=GOg,F)fVD01BHZ"LŹ.323CҜfZXe3F`GV99A GK91I)lX#CA1Dx,6nrȶ%I)e˖yIbIm&^1&f ,S; Øl7W [& [H<+a  Z fq˖`Vi՗e\9D1s0XSJU [- 2 ^ =lD%y6-[9ӺՈd$ITFbC)qaZ d×@r"i<1bPڒ Ee˖V)a n4-/8z|6{̴``z|gi8yD=HGsij2CaTi"bZΑ->iBe| #4 h]j7mDp7$Pkm -M7 %=<<ERvסxܡyt0 uh(;D<]?[G.ɣI[Z94u2kk[MIsMe1GٲZ2Δ5P9ƭNlS_w"1ELhT[0p3lu$Hx wbc|%)`iå {_ &_tVCW8v.> |*{;ԔCOt>@h,8pQQMSIWn_/|P\nʾݧ/Ӄyl.aT.n>7V/:d3h#@]WzJхUX&:R`eK0cܞ vX 5O6o "n_$wORWR9FIn<&1卑:bb %O8I- ZKn\skkzY skɓAP(<;8cH?\!z3;+@5]Y:Ņ,xv0: Tu׼F*Vqr~(yًf™/sֈ t9|‰F6ݢ}O/LPG=}L5|+C+%&p\XL +x5UU'0A'ک՜ɰE&hy)w"rθb¾ZwO!XcٺR!Gt>|,r A̫>yμi9yf1CUm AѪXa]qAi 59)X[*@RkSESu>zc>^<)J2FpUnnWM|.h,hlY % wzm'|Oט))fx$]L!V(LdaXF{, 2| 3"\Ey|}]PKw >9hOwPz7}ivR5 iu .{h[3;)+[j=\[լ ~OrR1^YND?X{ yX^&]簡8{ڜF_aάz?[7x2Įԭ6 , ={`7`&q[ytta抬:jhdAa/zħ))~B~T2Lzi2-2=߫r,?^P|-WQq"[#b0>Oacz^O$|( {xS b[*BZ5#qb]eN6J-0F8IM0cцJ}vZR%cPɝ,D$b' A^xMq&!T)Ds,$B*ȅڒkoE:~oAM<9V(+imdP{}SfI$K䯗nXNȂn9INod҃=*ϖcė0xJ&k@(;Al}ƭW$ Tr^Jw೒]L}^ezGLlq^)ya?;U^Η܊>7.ɒ3 k} p' |b*t|k')*ID6D Q"ز $N!E|XF1Prcj@|GkFZbrq{"N3tzYOFIǯVX:kVX:k AW'*TfV $xXIRJmrk`D:aއ&pTڇͻH&ڬ 0ov͝MPQ*֏CA/G0\0Y:^ڮruR{U-g~_ 5bq*\P͏-,'1V1Bbb(,M°1LeꔳRa@2LaoDyFHbDs--,vTIXV-\{5xoXzb _+X~Wdi@Q-%&g$a]&&;P ̅#,]J,]%YZ4pV4e|.JER?Eĝ\u0*FKZT =% V*I,A`Qb jh-0籰N'JvOvԲJ8Ռ]~Coe6ҕJ\JcJP8՞D.ESn+祝Vb\$n<ˤ]ArUoq| 122*ᒲBы=k3p3 ]8R9 YW-3T8t g3B)Z$=7AnݡWuC^Ba&LRiWb\/Z"̷{`h*ʙrf tY z+ /dYuKư)pEJJ4;/#Ms嗌&KFpZX8*iahSB<hW OB9'JmR$okd0qD#awL6lBlHj!!B\QO 1IwtfzWt3ӗ];&deTQM'A{D a6:K딒ARd>{1W %ft*)"i)B,8=ъpװmE(&\Q 'H"ګ8ObnBFwxiERǯf$ˀm?M Lc"Ai4p(l햅~EgNՕy?WKcAH{v0<.,;_{V;6]C^k߄|&I[+m>:l؜>\ʓ0 06x } xapxwiqSgVdh .}GOovin&*N# R4qNLs%+ !$jZbj LoE 9ﻗY1rYfOӕ8R#={c08|>'A7Sۯ'p~ n"ԧg'omy֛翟tFmPq6pns~<&5'av}ξ{ Qv{C[W+w޽zxz$J"I 5dlEGx##LRi_O:7nWrz&]pMU,3&BnǹnpsiS̞濞~ l FƅwL:yOa49(oG{_@! r^m7ld[BxL:ɬt2rN6Nxu/YqL]Ṵ=~owC`fI| Ɵ@E%\o[B(|ფ6t稟OJPPQ7E@Vlnhޗ ZB_erb;H|ج9س[al? 
|s/`GQXwd7]ֲO~_:C RvJ> rgs8܏{_qqǃJ\*DKCGw D/`b35w2:Pfk ,&[ccm#NUɼBߖTv wzq{ఓ= 겖 zPXώWO**%ܽ!^?u_W8~WZa?Z43M {سN,諳f2n$YJ?ȍ*o5$^8 ?r!8sr;ཌྷ5eQx7n+Nfq+3TVޜn}jL@_wN'1yu";F^7".ulGhdm eND}Ԗ|5OOʵx_ʬ1px1I5 c!-3YlVXlVqqX2$΅ ҖesH3AeB-wQ ,CPU*GA Ⱑ\0k ֤йIQ4/&%Gd ?걊$)Q:0˩G wiBD`ډ3eLiϵO KJk$bq^و)qZآ '4M `pOڹ RHb7(V p\U07!ಂ`uׂCmqƣV" ZkN8ysbv J_ 4FSW78rk@G+lڛp`F O#n a57ʓ.Q͈wO'=# F',#t\*E1s.\uDGVkϏoW2WXoACRCM߮]=;X uAPױJ\\fHXNj9pFޘoG.4ZP-bsa?f`?ѥ;-K$B =0׹ݳ!sVv)NG\ɞFS4MC *|Ҵqh㾥!=xy%‘5[匙咓 $?R&m]͵TDXIՠ"`K$(XnHhZq,B]i'gkK)nZ (A!w숹Ƃo!@#>2 nˆNJRuPh[\Sj5l 2FXφc.kLE.,Y<,v:y0(b9`NZiwS+:2; ax(8v0ۿ:X @&tP_iNaJUb)JópeEZp`𲚅FNf:>U@2ATwl򎢩kB(k@PCp.7*ft &{_˴QZkPN އaε07UCh2 YDF=F 3ƹ7/75@hy$SjӒf}rܨv;ۨuFn~smӔk>6!Wl;'a|Sl1g._^O80<ɽut`EnD?i2k1^9XPlр&U}˦ƚy+Ea"QNR-vK{Kg(NȰC +,!XW24xlsAi`'>>}Mfl!Y*:kGIqK-54^ CPB<)q\ro<Z{H"uyA0V {7<|ryCK`ʰ^DH 2~l0#\I"ЖҋQ!jZj(ehO-bRp}g-J7((kEjɁ7zDZIS>^}Vw>5ΰ%3FH(mmxz=Q'2w6,2\= &ͳf Ƶ3+$֕gw("&wTA/a·~X$9 gFk.u{,G 1vjVɷRx g菐>(4Q-BL TrEwpQ|Si C JtsUe(Ocv">W2x$L7üD50)qdt, mxgn­e/wA_q؜Vʄ"@9O9;<*eOy~~\]걢M,$s,Arb,@G3Eʶ"o E/9(s',uWWUWWE*(#mB"@pإ *\o6Bv|FlkvQq̨˝"8i?_~n.tko^dox?`tje`{TJ>^K![ji9f? #+*1DدFќ`?`dW0I0eLz[z+?uRU QF(O $w ur^ jH*=&Uu]R}޾ 5J! [$V *(ǥ' F T]Q{+Oil!RlB[.tk2#2&Ti}2bEQ҄,j bQUI䌁Mh"R`T A5tK3c:;P7j۲-QUO 1&Z0i1!TW4Xlի 4RT_c^vUȱfwBVkn% ׻4.~{s2!!~QȚxM4G1rͰn !r\0bDm6̛AމNap6gS;@Dދ$W+ p~&2O2??61}9~d;Ch'=  FJ3k㌼3+;#3rlh< LLDHaC VRM#)3>!(0F24&""zgGs򿃾vZYۏ vi]E[ كHȖ` kߘ\inT i1ɧr\S/X}VۏG7xa2f'3 9𹹼)Xʔ W(GsD'~.+WpYjeӚJ`~;1F;Q'#-Ը f՘nc^ 0:b"}.#* !K(FïL;>D*ɿO2_M0; 5ə- bVAZiP"ݢ[M-(hD, `g3'Ʋ & gKUX^#CE!01bk>[uXM{7GiŘp~ evP/TjJw`)۳ '#`|6*`"pnlFYCfuqǝ9?[E:֑nmfϬa+kn߻'GTaS'k%2O-AڇQC/us#sdH%5t3j#&/T,C0tJ> :6Vod' q\nz~۩P$Ax#M'קB3oؿ?-96!:.N?*#KL&"T9F? 
ovLlE\-lop=" .b>A88䏎zhK Gv<(Tj<[)}߿Wokd^byz[%!['ǭaL VR+=%ztONȌã^8Շѳ1A 60 adaSrZCdV|NF>Pm"1!i DPG_Dar=4"BFjqa+ݨ&N`'zl7G+?Qw+O/5P[B1 1YEAXs{vwADB;#ӲG1rIYd+`\63^ȊKPJ Z74HD@rΥk{x"݅Þ0a"XDhfʏ'28Glp#wl 8]M[l$mЏ54˴nl(mde\(5xUZʊYSFLٲHBRG|yqC1|6֜ΐ6FQ, k3Ld4t|*)L3=RX2S&xɃ(Ƒc00:&֙ŰQ!9H~:]vؠW0E3c8z`fTM 7b#q?q&z/c&}{r=2э?9/ttd6AEXLޚiuo|1+tq.|)X_nO%ll=P_ܴut-v{4#I;xtt`@zup:'=Lcz`7ǿVC6aJ MضoJ'h˥㗯O//.O/|u2[ffMͿOΏ//ߝ]o4z !ɯ:ưL&\7^ I6OMXNْrur|#-f(&W]wd!O{Ͼ{ !uZGGKŽӻ|0;OrU8=ݾ?[B@[Owo~+~aȽ}ȿNOUYwaTvVϸ^飓_C:5}{eDž `~22 hs-0;{ӄkfcȔ _s-`$sy eZƲk|KFG (xTOM[J|q%CnY\Q=chZH\\uC4RZnHok!w1/Bsy!׭gSgulA]*AUP=| Z.G ^݂k{MFSd>gYf:}Ufk݁pNϠ͗q燅Nt{pEFa,'&qulh J?`WJ><>}3_ OBzd藩gq7<4u"c㴽_Ga)2LQJ߽A=OT+3x8<Ց̇Y?4oXkz|쫉uux2vɘ]*Y'yz/?W?-NRr\:K>sGN[3\-%2//?煉* ڌ^\GSsn |>ޔ/)YZV;»/32|5EAz<"?Ls} ~[ytyXg[V8'<^ٓ+{xeO'&BJŵ %b(_nLdz|B@*2P ie/ʅcgn*ͅ+kAfI4~Nt 8} 5.O5߿%4,fr{)q6U+L&=s5cm#ǎ,) a.x 3Б1\T $cRłhiGv=Jt ȚS:W&am *®]3dAeԙ6'-%SKF]3XDA1z)Um׌:Ć<#@8GHmRg2{R.`ף6;b)u6JϾ& >xB\D8Y7WGDZ0$FWi]gχ.B:%7 ^O\^܄RոߪI( k]ʶx7Se桔S1q&5b+11ʡ5UKK8dQF;kVap(+z A%(+k *T Q#⊁”J~T w $ $cCYzeg镝Wv^gip9c D܇a4O>5Վa&r?.QC࢝x"z"k6m1~:.wmQ>.kq:mM))rnٿ5ۋeF3EMhFR` Ba^[l҃^#6G"Y#Zsiuo=/ B]"O&R 1*v:Bp<+:>akOgS#΋QYH D1$8BQ?}.o P : |CIIw:U/qy{!_ġ ==& |bO) 3`[ǣ. 
2tӈ`\=zi' 1EѳAO o8\)33Pn-16ڗ^cV{:4)xkX]Kti+n'SiϕXLɝxRxf?'kTU 5}%x9O^i{]z˅sqQds| T3dƐi u.ؖW4.wI^7|]a\5-va)BrPƳ\:V*7ruiRhϸ |57FKe` .*+Bո~#QW[ )ٵƩ;Uƃ=2RdwmA|Ժm|B*ոGlX]5q2 U^N܍V)u}͵R4wĵm[)G(椵l$FO+Mx%4 pјe[@4^r1t%ҭ,tm`5E"\ B5B ƪF *zMDZ?MA$6Q QlDظ,e$hxLx>nZWv?Ǎ#߼k>00afA)Ѧ6[^y}#lvXlfx5#EdFdFQ2ZdFeU&2}0 8׆qc;k= 3F-f$gMU$m_ԊijVjI׼Xa:Od,(YQ$%)+5`v#(aZ54`+]{(!.1V/jm_^t8uϽfun=MK= :9ZmrwJ믆1c ZH 7:h fљh9!H5Ei-+T֗挬!y* 6Jno~-W4a^ S$PhGs%~d驶,mtAr=;g kd( G5C=T!T cCĤLKN[̐b*>NfHSXn1BL38|U l3\jGm35s>4봏 ɒ(&굏NpIk͵pS6DEUb?&G,Pqj5R ((cFUpsl~+lkI_fUh؁`la~'x:T C3˯UgvjVʟ?FFo"h)~F3R }͉*P}6(R<| J;ӉbWXyI"zӖw|JY{T\W4 c]g5:|Ǘ.>CGtNRMDuUgJ,8'{_j+M% cܡPܓ6666XE*33XAghD3i4,('Q["BJB0#K/`y&ED5e12'r;.Yqo٠ќhBWĥDFJHy`'(D̔A("PH؍Kbu[nA^ܴ6+zzK %$omauhb>I R{IG4퓁8a[OBhlb/4Z™V0Y3i X )Nbƒcq4pVq0L@Qw  8Ü}0@dK)@l{ ]踺2 ֐OE%i^t'2EUnmT.Opd˒L`(<*FJ/'Θ|?kY%-G|i|ׄFE&n.؋Kw\ fq/RUo fθpszqڕOU1 <f R;Nbk~~:^z`_Hȹw7V@cr<[fNd#sA^: .RH㜓 d#礋]G.К@k'ߝt, |7ߧaN_|[kcZ✹a@9ojjZ0콛<~O_omo?fW3A-P5Y.Y6??OmH$ČҧlɾSI?-Σiiy4]Daz> ֫u#^lv]A7YےŻخ}]ƅ'w7ZkЮTr_e=]t=,bSL8OSr9<sʔmr&w1MݪW\q%WaJNbVx-;n3YJ+^$I۱ѵ帺<\[ ɁX]i0n,뙖;y~A[sK]&n` ~`j9A!8P}ynڲ&<]VҁQqaأBB}Q@%yThv/7'.5r9MnQYũTsdϣ#ZҹK?U΅2}0 8׆qc;k=1j1v&X _h/G'{-Үpbx>>U}~Vs5Jo~ MQѸoi T~h #x4tWR*"V`, 3eԈd& #il5rDi]ziEE%^O¬xכ>?oya>qmM,_0Ygz?Kr*bʤЖj+S34җUa8(~n( 7\$"./Zӈ1pDƀ9*&Z?XRS(_..iKqr`rdi5Sjn 4N8߃n3~$?AZ{ZhNz% ==?p=#]bY2-1|2tPϗ{KURKn.9|/ {mBdRk-5=WVSZ41`ibD&bȥॄv$D:-QDZz&c$"h$ÊzFD V]D-zŘNtO-Ў{rcVTPV|CF2tn8?SuSNVy5#%>+e&eC))9_gr))o\>[ᛁ,KW?K3YIy.z j%*zbŸ/3ڛŅS'uWrbP=n>-\cB5{0k"3 hȱ ÔsLϜS,mw_5wg/W?M-bo6}}hο+~+Pe{%&ӷeYTN :8?gxO/H5>YOh|XHgyp^Mg ~?/vӎ~+կgI~}(Ii4JR|>}XDҬ>8X :8M J8- O ^N?||Wf7V[Іhv԰푅䮭 l(F W"Sˌ&\Cy;\c[OXe3 rTM\Z -=b>簀W^(K4 "wFǭJIG)UV  !y-U֢z>$x<>~>@d]ڷ֒rHwXF+'鯇 \dͅhU0m~ )1' kBzw?k s?0ZxNlb|vZIz2CYE;;JƓVNH2'_gV8cڲ_?g Cvu~fח9:U*.g s#)#B 0|l# #a0 {e1XL xa(Шfk)`n}9EE΃& 5$̍6(xm[M!X^*UK޵G,Ta tjNW f7ϖ67C΄t-Mٲz75.Y=.ΥlY4 -$*IKi wmJ~ d/``Lr-lA,dҔc#"dXq|[.v 6֍mqWj@_ژ-Ojo&|TJ(ELlֆ3\mA: *plm$0YX0S s?Gޑ4'X5SaZZs-_6E1#jǴuͶkQum)$$FCx(c͜`DMU2tUe")ByTP7R kSymRϒNLNbbi]L̖BDz*0E,I]zm=zɮla]v,!N^ՠ.>vo>\vn`yI ]^@x 
^+MT:rX /O%XOFNO#ۣʟLhd ~ٞ]U]z+F!`2cv++V٦4Rëu!B7/3FN rBv^G/q }\ 8>.Ꝯ`@$Yx73FÎ\ ⯫swܛF5Aq^eJ x+stJJAif08P\Wy]K@0WbVz/X HNpШQ޹tN46iF&xĘ~rRrN! a zTǘN)wLG]>[T_sXOK{'-gnW$B pq.kxHwx0b4v-buEx1,&zvtBGe4slɄ]0M~!]J >sUZ[`b+5WWG/.9}H#I9+ЋErJ4&n`ꚛ5`W`DLAM)U9$Ε:&xXgHdU!a,iXWeMhOꜰ S.Zv<]ІG!kg?iDk֗Tg^黠Y@!: ߭ rmD]<dDc}] Hoj[ c+qVwp?_ջ "OROtrUDF*){ 1$! 9z`[^Φ+pzH@N?/ ! ϖVv@ a$?ΉYWGwY3B\ľb.|ꛏ~fcBOӆ ӎ :p6'-€x<3ĻZc_pfNȁ䑗ei4`ܹ(m |fG0H/δ" |PI/x`0Iy& m$* Ff,Kιw}۠?*κ`2vU90'RdU j)QFUILIq]1u_iYx.9g޹pV9ʉ#XAWUЌhU(q`/(b+׀dÑGD ;<"+ P JIA -0d!^J^U@j+Ay0r": rmxy)(owYmU>7ezP Ȫǖ|w -TW7~z`jJĆԏ@+gŚ_7~R~f$/~Zugk~{yzRdS@()R==xQrM0gj5[swJY͉R#lɸ7/Vs+xKa|o9XW=-%Í5x񠬧{53zA&c4 $cڶx8p}\Vuc?CUCdjsf?L#]^952@\]ͮ/XOzw;lX^zەC_!mdʉl.S1sl"=V{׺s ʀ *n |gfn3{/ \|+qo% KWq'5I')N zkI̻)]馈>.og% 8,6!EB+8w^u3ѺzQc{$8Ug=VJ$רb<:G8Eߤm%Ñ \xؒ6FƳ̓g4Vݐy} !q')ɷmx;;;Rt1\ \Lo7 ID:2&&w s6hZG_!s:; ~s =MnB kӣ#͹Y蕴 E@i!)2"\w;FDyDFvB6|Υ;!Ou oChIsjW&.ɤꡄVD' Gj3 x7v=ଳnu1:j$Wf43f@:qQ}|utE|XSxW%,a^/WJQJH4$] Y Y.ERӡT4&hyIr)EXͩxYs(iΕq #I)jVLEˇ9vPy/˨EmDuqˢJnkQKՅ /4 LȱqNl(R b)y|YTA.T3H|aq{9b1S_mGtOFyǻ"O~ 4$(8DqIB&M18hQXZC%bD^WY[؎pr%{Ex:w`bi)lKA-؊}%L`?yo^Lso)!5̑ m!ߓ$jb*VsY˺XߍAIH~kH}%m-1ʼL Ă ;78Cqᅈf:_g^IHOu[Dd5\=ʌ0dF$sӘwR!I?Mtɓ(Y벨9p1[nuFfӮR L9R&Zqt>P{iV~5%*R)ddUS!sYUQ]"Pڶ?{ǭp$\ڼ+ #I!6C/c:aFP3׶[bX ,GT`;tydGjU&J/DDhvrcuuIG(]V|i8?NIMtYϿu]RQL3]Oz,-Nw_TxMӥQ5.f"YT^b36?jI^Ywh(}~ęAxQ-7DeΣO㠓GLw5VJIS%ElA$x-XhRmAiED1H)-ĉqdnq 8R#'%:8@QZs6yՇe*-=:_= Nyg^k0RUȰhĠh?U REXAרkNaIeUZ}_s9(J40R588RenYC |>LJv\^׭Y/|~)[tF+>*c_XYi9aK#PJM|חDM+).gLu^>3]ӓj< ے@oo~+߉ծq"@F!A:[ÜΑ5)ޅ-W)p!|dCW= 0vvgowZx[wp6Sk׶Qc= Xrf{)I`}]% yBӝ`N:JhZKwy5+I x Btq\.mA=I/Rۢn6oN_ΔsynG8]0(.1> ޵[utuX /wFKz؋ӸhWStn MdNmX;`Hw0:8i  Rw cI.;3 ^󎖊=9X,ijLgDvJ!@"A< ꊆlNt4KGk'oٷ&3̼v"JA>kZ;غ\SObmaՏz՟+_t_їsjPϦi]]g l 86\K=}֦Q"ia~3p@&]sP4vXJzγMfeΦ?+s&*JIAe- 0 sN)Gz&k$)H{q1$pvHp;DBJ|ÑZpg- dLdm2D@U d /JkIT& ak@!¢'"[(%9yZBtVNk)lf AGl1ܣ'8ߤ-2Q ڊ\+|$2'מ鐼G]D`+~X'<Pb4j! 
`.!oTs:˳dC\ iR㬪 |6t O@9Hag Y&>ʜ/`w[ o%8 EЈ3bD s2`M  ׃s#Jwb$=K\S%+^E'| JN bmN^* !bv iZ[ćD9_^2q#C[U/!$$ uFOtsGώtBѻ}2 Y x1Hvz1 .3ȟ˨{W1 jS#n]9$3z q}^"ڟ䘌q[5G$ <~4G1D} B̀^" Atw#8_[b=aF{kIDh.)Ӏc,twu.h=:aaR3cNH j֟m-)S~ sz fa[? CZT=Ơjj'C!wܝrwz5x:$E]9ǒ3c5E xHEIUeR&GVJ5AN1e lQ=~*f \_a?]EMrP-ߦ;ү`\:|g|PLG ߼wKM7.SV?~ &xPV&d')FdZy-Y]+2RqUPzX,H մ*0Teu hrLLFhKgZ۸_anխڲ[M.;Χlf]JTVmP%bЦוDpqhen;ޢSĦ>iDF"iN X4qTDp QJZRpM#NcDϓd}FùiG"D1Ă! @)PNSå,DJRI{ġ R*s\L:$N1i^ӝgx4_/f J?(ߥ(}lݳb}%ˑ/_\ Y}|Kno?|>vO~=@`<<]11@tP&ާoLgê pA'[8Ϳ_L }X<>B{/.ټ2ѽv;Ra"}%;)1!F'o=BWX#nyIZYu! X{7buE%}X jzc]!F-7U(tЬR^#=+kj癫bq*u)ъ:sncϷW~sm$j@[[=DtO~iJ`(r6s]Dgy"Wy;vs|2}e^~c3`gV6]+;HVkgxݿ-^GW'\-߽a^80!.#ͫs% Ij7 @eۻhһ5kFUꬸV"h0ysuN->5EG,xrۀ?cNkO]9SoX73,ƶ`,/M|"/\"o_AX0wz*,07(g3]Nk3\>IGd+{7/rV_=hM%i]5ò-Ӊ}GvHtZnȁޭ Mtæ 1"">n]jhI,Ѝ╷0Z`pfccYH(k^":C+W޾J2`Y~KQsY=FiSh$;"k#iыML)e N_Sv`}Ϯ&Uڼ t/b-_Cal2B֧f SoOSORlڿH{?x`(a՚ij/e.:FdJn c^zX=Q"Me#rSBa`Ax֮+J Wvw\1Gfd/[Ù(Eg)d?(C,K{T_ۧD =R WIȝ֐D9~뺔, 7H;{qfjHtaD]Yw\CccTvۜ28կ *d?(F4sUCs=V;Ssr ]+aKH?݇u.B<"U> 07e]Q!pY3@ևeW Sc:.s(o]~:wʆpYtF\Fͥ4k\cvkaƝKqq͈_=2Bc_]35V%ѿĖ[HKLyy*t#_v .$M ~>բ.=jЌۼ. )_>c\0Q[+?m AvO9xMOn,~.}ҫ N廏<@ tI팑d'cdȠ {!L,Ռ2lQjeTFRC ɰ'd  aRc LtDlSmV r1cIʙ<#Z$ NgR-84?,*H&"+-D pbH8%r 4-@PfӴ>T jj\i:qn3)FW }X6qb[!=%]A8<惍Ey~WrytUמj1ub_i!d:cVJaiJ fMFE=N71v|X yNuN`(JR>'\#q׏KQS1c&LFCbA1^#̧ӝgy4_3Rrqlݳb$Jh#w_$iU[r{ܠ_nQLo.D+ODe}mt6_jbB[ +pT81١k,KA!RK#a+ҙt Θܰk"S ;)z_v*8NJZOyw(Q 1R3&HSI$5v b$Qj8SWƤ.aX., 6:C N1, }?T6)֐Y BG(*)BQíp&# #&aPL4?/^_V *ѹmHBW0V3rʉ/Os $r$I$?ΐ$F`(&ןT8b!@ݙp Sࡉ<6<6XCK B_;$ɷ}$zfJz14LrPIQ+$Wû롒i od7_tt]$ѧ~4O rӝHW80m7g70h4$XS*(mKi={4Pa .[@I{k#nM`N*k{لgwi2H$x,! w$էF,z t"ڳ~ɎNW#N)]9"%rD:I`MA nٛ ԛ}KGL@,UMP%(g8U2EZr\%cV(LA~aw/?c0 %[;ډpIʌTdJZz;m$5 ,Ma娯`6T(ײ\Ȯ}pB>eN j%"duYoryZ 1er((_<^Yi&_TVrJ]I3]%GC0I)Uf‚ÅJ2IljJ`F)IL*Ŗ۔:!Q[ !Jr+bT*tH. 
͗Ogp<_C1KD'; ![7hg3QFz,ؘA+, ^g E+ =]:-: } '*|7J+6㥦+u41R3O|ҤFaaŽgkG> +SKKK{Rk^B9<|ӥl:y-Saϧtzݟ>f{a103,^l)5y# M}.E; }ۉ忂h6?ZudwbG7Än;9sEiN{7/(0Bq):?Gox7pG-Ӊ}Gv t hͻ3z:,7(ݏ8ubb:NRY-6лa!߸nT Ѿ\':AJ#3<)vL*RZc)'km#G/,2_Ňqs"[p .[`ǚmA=~dK(%~),0;q_UW) SPfw0;Y7<l}Mjd3>AuZ25ZL MD"d#~}4 Լ3N脙#o͊J)>x a) ꣹ ڗR Y;Գ# ET겒8nN`e1 -ǟoᆭ4|^ zEe6_g *}Vb#PЭB6[AZ7x{P>10?&]*F=H|N )NYRD&j)A1lӥ#ss{ٯ9 :r`|uqU#st4ml1Bh ,0NMfýp V:Jۻ`t\':Tz=2a Q;8bɛ9m:L? =M}4x>"C߃hWdn7ڼ4LXhN0RGŶ٣3zCחt$BT*>,>o{$REo!33߆"!S\?Ivb1ohy<gnzj6$(r=NBw^ppC@0= hyFL<)0O$҄E.Dg<7#9lX,Jcve* .)άˇ %8/ʜR`̜e9r([:# 6Z 9R2Z~$Cpm}7 O DbPbK.JEA Rj^Ve2EK9f1Yje A@(=15g)Nz6%9U:%/p!B s&J;e2Zj-Ra00S;5k]<l]^1EKEE?H|ӊ+70x"l F/XIygN'°s_VXiӔ&Ja߅Â[NF9ݦQAaD_=bjxhlF4y-_;b2Tm@jxo9xUkCB~p)EO&)cnN7Bۀ֌)WiAOֆ"%SGT 벶RM7/Id^>1k<&+O_KLF/̏|c̐I6{ɒj3,ᇯ5w ߿( 0|V)}8}jƨ!Wc9_~[憬)Ę\ -qwFraMu?_ibz'#_z0!w`Zvcs>JgX@ ݣbvz5-㕪Oz2ZZ ( @WGû|t': :o˴>w05xƩeS(ToixhP'"4~K0x? rW)4$o7T48t 'lQ3L^Ϝ>5|> A̩5E;5hg1YY-GkGI TЙ8)ܒ]~Ԓ{P`.7+SZ0 Gy&(@" X^ .BZd9NQI)U?K?, C 5/x *A@jRUȭHL/ܖ,IW8\] !נiEv/9g\.һu˅ٗf750@2yʿv?O-#dt>}xJ\ *Y؟4- 9zFgܖ'-[7~IX._6C_ 't3׿I>ƦQp-6GKac?LZfV0:s9{M?d;(a(6KcűŽE԰ ~D1 jX"slG1E!Ov|k_\P`52Z 8-{W3͸jO3>b 0Rn6aiȚi{?}œOɺEKf/ }onw^ )aG8eP^]h*$O'"ʜ\NC&ZP Q{{&(1F̜3&΃8= c!E&^Loj23&4Y2^TZDN$Kˤ@|>X| bHAS,,x28p"΅.V%:e 3Ks!]ϧ se՗rwlmo6=^b{@0酹z+E{ 4 D@{3_<z7룳j6c(`pb@M2oHq}| I*x'} 'U 9^ UlYvh_s0o-\CzwJ#]{wƥpI (zE_s=>9#N\ R ]p`JzJFHNq ff>jQ?;x2k[tˉk$Ru5^P58qp)j*K y'QwԶA_`2׻PԹB毷 uh+"i5r-@뽂/%p=:ܩi~Vp+a/.t DWI4H;_?xqmPNGnt`z*m.fye( (W\Ȓ`EDbmRITt.?a~443@鱞zv|14ځEbD_؃U`V=X*0KV9"Z3R$%!9!@@DBi(f$3a]8:U_b?%{GӞ.L)<+ʔ3`*Ef!ce*R`L> B;3w՟^_4Dv(vE@Ǩ v8$Rgr0 NѶBr9F1oNzs礯37&9A5>n3vƀYK'80ASg eR ߅vsw1,zuS.بRJ`9e#a=cTq2VPjgʚF_aecKz40òi78 )Rˣݞ^  -[,YWydaCR!J Ly0.z}G{kYx:cӺi0eçoh,LwߓDEG/4{/7} :H> mbaj2AcL-3 P_W)fYC>LRW#N(_1Kr'`P_?8Fh2&u|N g>ϿXX 8ݵqm,(vφM`4o/AE`6F`"Y`h5X 7|4ͫGQom}4p89qr8a!2q_\׋5>q =5=x͸Z` )徟Å<Qeiݳ/qnQ c)hV%È/,a/` Ϟc\9J7BCxADpbywD\m6 FvfPкo.eZ@z&TYD|wN.LV͙[Lm%y&8>iMJ6\ŦO$.Hv[y1sZjF mg6]J9q;=(FCcgC9Cܱ!VΨc< Jt x4'@C_"$h\ŲrAm{`]}תv x\3Wwx,%:wTNr yCVlB"WCL 00}׭kg4]1ݫ٪Cs\xS߄j_  
MĒ˜n5/\@mJB+4EDE4w !@ӂhShU3l:dzޜByYYToSr91.V3^qRu;Dƙ ؅\QҭEBybVǹ"'i|J% ZE'EꆷuxVqK^X EeI]2RKu&D]-5^a2(/[^L*T-$2rwW\@R\د1Q}aKobXŠ89(-P88^@.V.'cb1~o)$*` ufA94*Ƀ{SתpP, >71*_'ނ30@|PR5]IdJ=Z{yRx !o_I^WCxGw{|қ698?:A.%"o㭋_l^a IWw/d{23e2/~.6;4kԳ%8i݌l\GC$ i0`L񁊌7S/QqM Ʌy6hvöus}Fj\I~r2!+$S\hs+e l=-Pl=9׽ί.~*2 \BJ~Z 2~hTiV_zj({]P |{ >BӺ93?u,+gѨ M⒋Mg5c-;Q;I`.v4.ϕs~+[%c`:c~uT;bqt^Jy*WPµ#J:Jr}L#;ʪ_ :2%Y8TL+q }B.z.S/-τRƳѧh>&DB#9)#) f5f@a0E.?0!Tр!Bx`(X=)Ry#WXM 3:9BB k| qO`Pa&AV[}Pa%11> @'LTF  z}lܳi=ٷPmNlZ!.&xy " ѠߓD$ocCKhDAo}Pr—dZAcLrB}fj:V4ـ!Tӽva28AX~sǴ/wsy\t|#pv]R.5Þ<O(#"DiBFz$"f Bct$5˩Uۄ&|nm̽6ٖ$FgkcA.T9X禓7ɎH쯦^]b.W%(d}DH~zhc=H m`>`%)H`]&Ln—X0 )LL+;q7}ٛt5ɁޏdbèVM2R9ۉVE>؄c|y'EL^3ޙm-k];&+Z!یrq83 -'r4D`Am%bZ6j/ѶzŴ ]9,mU}޺v"Ÿ¹[Yyff9i.qƩjy#/ j MJt]9Ml:# (lt g)*ﴱA(PԻ 'nl+svMo]b.FKUoΆzh"\EqƳp1n-swd:[ S,IGJJTj׉Sz=Hkc*rL4l(ؘ*B%)X-WkAE Ѣvl,bmLs-yzIYXL_^8eqO*Ya;?&%29(ҮOE,aock8w2.rv6&joAϲ$YuxFuV蝐%jQfyr0k[DaΊ-9ݒ% K3X2*{ mZFi+}-oq!!XJ"sn&]ɒfR\r3#{9f$NNi%2 H}0鹺jSľ}5\t:#ZM8”gMyVMvAtHBf 3ˆ@{=3uK1VK*_r"u[W3IO.]:۠0 SLt6P鄑nW$337s9Wn録+тt^M4SCTm4u64R}Գ-A-=Or 1%T4&9Bs!k~<A =%Ů|.ދ٬k"yQ:D V%JIqɑGx Q*`rQA=xy; *&YV4̅^VExŇuj0EZ Ki 12D& Bip(u@aw$ $Jta( <У4d@ˀcl", #ʃBc ="ѾO}SH56bhKYL˹gu>8XM"i Wk;zhO_vCbWD>'w+n&ŀl/1@t\r? mt|pAcLݞ[dPK {/70rB 2xK8}u4!x(fx?rWv_X`3>Q@_Q)1jM"%9)r%rK RKĕRs_zTya`K9 ތ6Q`|B 2FPr,uU!E\=.)O b'xv>0X}ץ5q<fԏL'DQzZMzZ"Qim痸7]:d{iz|a{qչM#Ȏ ʏ(?ws~ ,IyHa |>uq L"7no#aǠ7NI}7wWnηdoWj\Jol*R8YkTgH+ָ1]juA:;;\K5[mD_ l 鬍@kF@NL+VIc,[4G5HVi#TWtj]ѭbC*g\vܷ۷o販i Vw5nc).ѹb:Bt}3=i'r4B*eAc+,' k'D&B0y{F6? 2;#E qeUyff9i.qƩj{C-"`k%7uq*2*!JƸ ^T=E(T3T=ӝ,$JKHՌDhE>d O#xNGNWNqSOS(+ E5N2Uo K$h/dgHkmEb/~ .hu [Y2$9teYC I+-ڦ33sΜ ;f1q8^8[ EY:ݼ"&Gj.y @K1]ԘʒZys+5잴5 AnRy0s-`6{}1{P?3{H2:\qaaϮtovVMXv@Ϧz9S <%PgfbbhĬ r{L}ڵf^n'^1{dwOzQr^L03;^sJ /.SUv6Y΁llw 帻!ݙS(PNU"]QrL@N&SM+၈aۯ,TAbl#0,Kw; 1#-|W\T]]q 'zxO&f6.?Z3J~Y??٪笾Em;V7LNO{=Y#>(޹Y=6 ;RJ1Og`J0!f9쎬l0[{V+ 3s#ã-sYGWHCaMS{:ܗ*o%6+4Mƾ#+|1EvGn, (;WFފS]=mQoֆM#?. 
5=f}NRCr8$C)%')֔Ji֛h j]-qR6G\U'ʪwGyad&/ԝ9^Ц[ zF?-24q+__\h9v.'#;:3M쾸~4>jV8}adI[5ȪBԵEhσ6hisNw.dqs9SscT4}ͥpU>iđ42@"xʆ ioZ3@$zo@N\G.3Ntau! ,,:XB n`齮a9çwFS{'37ntq~2-sMJnᄪ޻#յzL@b05 ??EzU7Ը8 gDD9Kz.#`EvOf<3fP(1z_LMphV-74|Q`yr%lJbzϗE)t5։-0>/zpl[CR]zzuQکYV0:X+eQ|S87TL*Eun6* ͓&HKSEsJHP3!`@2aJj\p% 98%JC()TI\@ I.s#-dfeL0)NQFY@ґjAwBc ڔA>Rۑ5 x_E6S$ۚH"OeF DAdRy $TD4CJ #T'~f"U!tO$IJ.+$o]+kuw15̡8 *(Y*eZZsBh20r%M)KN4 %toTrȯ-Rnm v%ݗޥrzz=Z}h鏽&K<|'fz_ڹH^_ a9oo׫ fŊY%/opg|\?3]dbr!|zpo 4Q._>xR̉AyABU@|12y\Lj>3V;_l$C02ݲ*Jˀ(ݞd+csَĆ U2RF*'OOy=U\ۗxkZ̓{t6rDX঳.᪊䥲 flfToƟB9b< 88B8>;v_@GQlc#sAly a FTr.+sO7<3!a'*|sO7,]tМahD6ZU{jD'+:i G`(iRSes po?tbYrxA8~XWl+,8;`7Xav\m'vf8i(S_mi$m k9"8^bi&ZJ>' =),؟>5T 7j)Q=k: o2?Ag`[^ 8%C B&nkm"N`k/ ;2m4)v>̬K/ۧ'v?my?Y5SY 8IRdBk c<Be$JS(fyL(z/;%/^ mefm@v15ê/%|y|gv@)Gqȹo\T5rE̹RB 0 (Tf R ͗ rQ [aQUqcv$Y2M1`@JD#4I5˕J@B@.qFy($ ${b[w~~l }(¯6%‡#J8FȻE=dZ #C(#68QRv=8bP_V |vGv+Y)kn(s+Sf0J D8)ˉ 2eiB5;3ʡJb1J3ͭJp^=?( &}><)}\O7a1ctJpFO_2Zl>JSӷh+ SҪ諓)E4e[t261SMo͆z,>扄`H4 FKEp#8dnnmU% 뵫6 ? 4.i30xx;aȢ>13p'!O!Hܠ1PDgcxItr>i.wFDA#mscDE=NG3C >N݇B+tܚ`dso( C1쏽( c(ᑛPU`>9 F!YCS%s$f{`|hG6L9P[SkhC54͠cT# (Bsխ^V_w\jaF@r2+=/,ڵ;,MQ"R3 S(rE3iyHb 4)gI)9F^ԖKm&^Ջ萲hW ®^9}@\_I%tk4F"nGR'f(2-7$貹hJtrcƌjj'A4ۣw{*G={n10WP&{uvqP[0WhI9~M0n?cXtRaO5~n F/[TDᔍj j/݇YRJ]F'tNxK"&sEnb<ЌJ]y4[^ڱEYߞv{z{+Ol.es8otD#P8o5F% yGY<*}UzLԬ5Dx8¶oQZ9 lI*I±*4OHh!)@$$DIDGIfau^7*N(NF:[+ WSAqr,Łc<qJP!%PS(B13$\(GZ,A3(,{=P$(n>N%g0=} 1VP;$e j[47d;PP2*.3U~*ף /߇Z bŴh44<0Tse5!q "d|[ *^tHaD6#t"҆Un~?>AEFQZxCQr=#VQN ̘WR mf8Zz&X˃ij}Q\\^C%>:YNCd Vm[ϿWQH.;0Oi%G-F#!-jv iVqOe%z0飞߬opfُG> Ldɡ L08B3 31GȜA,e667.\8 QsՄMܯwnkǓ%ΘU f#T̝g31LЦɬkRcygLa"N3_茩X6DhWXje/6% zsSsάۘjIV kJnXG` cԇDH1+SP $Kdfd aNX6F]H^u"_E0aH]/D>.)KyQv2.W̝Zҁma?X0]+˃nr7:۩ &d9ʕ8Ε"?N 5%0KQi*I ?wi_P՝튮N8|$yS\Ll M"g9& r$ 5D.,Ma!L,cY R*dHq)u0LlOK8mOe8xHfs[pr,6^_٥gAurK02/6,7'mXT $iBӌ&s8\@@(ZI kJkaDys^GcsFvw%1G<ȼap&+ʰ1*]O4QUaTmWR#n7e8D. 
z NB.AZDFG[s$wサ0fѻ{Gc=jf6Zb8=|%ƞ5XpF5 %͵%!hƹpT"PJt p:$\ÚZGF循Mj5 1m_LnVZkv[Y+T԰yӑX]\)TԙN/YáP.dSٔ.Bha'i# 5}k>jGY ]A;ij',[hpjSԹRi%00-7*ˌ mLTHNfӻΖFMBrJ{uPjv6 T&ՈVUlG|A7Gg&HetIN]Ԉ"Pcjg=&s8&S^56F EfxO7a1sJPF%Tqr o4É7fڑ5P8]?~b8$&mBLIdOOKm|)ǡ1"IcFKT#F$qxȃ y _dKv=E%(NRG}g"s$!ɟNJ?;s̚LF%J\.兾X*5nΊsS_,q^fb|$엲<Ԭ;:ѩ)DdyT95x+]Jh],50!{PH&RH<@)\T?h9/:Rm=jg\.UHV+fsb -)Na^)Wڼ[oiQn(oF[]ݪDPNJ8iW92׭7dBlu آ2nu+ ^:4֑n kࠬukxBSD ?.1^c]_EX 8E01`Ϣa ͦ1c(OdOxG$K1$@mu\o;o\%H|L*#gt?M,ɴIŗh:6.}N#,5?_Ĝp|]$;kE zHj%ߣR-<,36 B b87L #M:_l~U̢DV}Bg݊_$S7IklޭeBHh6եr igG aS~ ILj#)1*S(.U r>ѥuR_Ts6(^dۛ˶+?_%(rR0My$n_qñR7=/OH~O۞֕|bHMO ŏy)?37ų}fˋO6Axl7?AЀ#/izѷx>7|)H"ĖRlס(P`c^&A@?FQP%)mK?A-g׸Gނ (vaQ69EaK׻ LA܂r+!' rw `V7~z`֟V9)`X{Tޛ(o pKm ,}ùm%' lj#@7ﱕ^Zm;7JHЄ7)Cu?B|y=`773ۖ!CްJd8o75CۏѶnop)Fp3};Ҿhe7}wN~1PJ% >DAgP l#b⍮U ^E:,+׵ v2%B( IgHxRf"s=vLhGCS( tcCD$hbD >yC5!iq7'vVq5*5 g~E*[e%?8,m49h a:W84<R?S\ℑ)fcѠf[Cy@wcP0N;AB:ע"̣Hz} oNv-&w<- _C?7G [ [8xg,O!S1`R~`Y~3? -ޕsLmN_'ދE^xU=,kL}_w _8jLMi#C -FDZ/| _wmntχO/cm4~H4,GRwE`ę!<iD76͐ş̾afCLhgY !緅B+HOr fzH5\Km1[-.qM3;ﶠw;Նjҥ!xvvPLN7Ifwœk{$zm-D i?4Y_S©"1Cm,my,a& A08=uNJU|<D=M<6)m`z ÑwY?d AlnG_O_W+L'րxIجU9u+)n <{Ln=Se?RK qbyaN%YH'vԭxg~ݲ@@$n u TȳYADz.b,ܸ {a;%(N=S陗kÔ` crr]P+RӚ|[(=ONh:鴏>;p|冉\_t&xf"T6ݵXϪos4̜_~vfP Sس=)0,q`7`b@&/ k07=AwY-5pN V6c9@MpK3@)8\A-"H1!|5\N2u\ Kأ:ݝ%pb3&O ?W%Ƴdèxv/^#n G9@d\BլA(q)(NA&dgP  iDW)A ?9AK<PRՠ MFHRo2wA3XbԭLb.Ie5}erNjZUyW"n>ZdcE"֭?U$2s΃8Dm`*bt$0 ,÷h(ykKzYl#erߠ F m.$qծ臶2ui#V`0[㶕:7϶rG܋FY/1:רfTw|m4:CeC-RY-M* 9h(g/XLCAOmLu҂#SߢWHp~,K3SA4-Dޭ պ3($z]A!AFw2C׻0N!4cqz?] K^Jy\;-q]by0.~oη:ΩqID]Hw͹QMn<+òܟa冖'!PƸ^>f,H&4̦MgZ0/&z\^4V&wٍ;!@w|w1T-9\:}},ex!رj<C\y*]/q[sc4&s#X^t˃Vj>>(`#vi1I`81BJ%mkdě2 iWS E[T./dB-;r$\#HO-~#fZ[31kl(~W8-(M$1snzWS".p,9 y([q`"4hģ~Cw&IGL)RW2A}L} g'IG;)m^琒jDĝ iEܹUV.8aWȉU-rɓ+S1OE {;gW=sE@u}n%.ē)}k_dWE)\l^h[t"I$HM⹛*Px^=$"z+ j_"d'wdl=ב/!xrp{+%.eEw\ٹ! 
hi-5J,1.%~?qҥ.ȏ4$?]2 uw]cG"ԉB^X,K]8cS&yv/$:ٻ.XA.216х}0}P,CI(G`a>A<;7SpB+ ֭s}$R7͔,ՠT.A9#O 0RG6ʺx J LYԿ2 m!sڒT^n%$gho3'2`XsT"rc(c%Q%Dy}㶧c6_]صA/~3OQ<_)3x__^F}qFn7?A"$o"}dk|)ӝK+²7vMl[L͆/ K(m g~ތ`^b4NՕN/aoϮ_ZuD?*)ox8zh򺘾.n(Гbz}ug1^ .ճG]G0N;N7]JId0lF 43*<1@77VzclD*cY,6~"tWP1Q5={数uX1ӫ-.,s C :qSW r|2@@KCuZvNTݨ]k~lj ,Xo#TI[A9NJg,lR4R?eIDs ?< wi`՘6osqd*.nl3QP3s}ak.wdϒZ(VDZ_vfuh"L ɞ였=`|͸|˪_4[|(& }g8&`c[hԮ玻p}Sgb*;#&J'FJyFGVrYUNVɻCb$QDDL & ĒN,r$5lT ڹ.8Jdӳ)cf7]0W<ĝAL2ANo\2j56Y\MA/kz=%B &xhkc5 5i*RsƩ7]wMҨ5}4y*k׌Ҳ5*8Nx͵%zКS~u cVˏCS*:ع;jdLT竞%LjQjYRb[k2W9qؒXu[!_uM੎%ʥgnNRWL5#䍐0Ӵq{ݡ:Mu&Y_SٹWZri}NY캋BPY.Fc*ʵ" "aD8qSVZjD%0 c͓'I _,ZR(9cVYԅo@8t1,wb&=K.0 8%-+[};bjH*Dk*̶ԩ0r;=tXjj-`VqYdxB婨c?\II?{5 C>~0#d:gja vSw?t1\3q6@{+Rvt{m[̱Rƥw߃i wrQc/,T.lrlll:MoaV"e:Dv"Fݼ?}΋{AߛF#oE:`}:<.0aȔ9Hgz'kn; (dYTLc!b8Fc (sa?ɬL̙4AM b]3O= Xś`6d{/fGLzŋI Z"{Q \a ;7,&(Fi)fWUx[5'A.NK>QY+i/GYhw4:^B+l'=,@4  W_E1vyEPrLDז~`=T{6WnN)x~Xn9F2sxr zM ;d[ ~HlS_M~zلaنQL%Bk]zw0AArbvG20F+k&&4&HGAczt8.Qc5g'4AI"bmD'HE(U TD Li+i08.FHJApnsuF+kݻ}}m/! DTӣYO h\CݮX *gS382QFnZ; otr}Ϟ{UtЋ3hYQ0֥/mu :] ySRDNkltNH|ө{Ww'+]W/LJ+ $7*"ƹE?>.DSfD'"U`J1y%՛yC/n#§I,jƃbYf,{ 7wo&nǤK%" 'Дޗ߻)T@Sm Ĵ+6{Iz(Զi[y2@+$ɲg Kkr˕)tQVQǪ̖lp% ׆Y}8KW-u+MurVbA\^sc[ GrZp6(zgX Ռ3߬ v"kDV4/_+$UI3uBQEP^+/0$Qy//_BQJRTSK c:-Dò!-}~c&40lUb4`51{uyQkI`b?S\)Œ")`DI Yf[G邟r"pEirI &bƱ1S(ę 4Jv?kI 8RXmJ }@ KR|T }/EQeJ0w`~;䥃1v;p6U f>GOv=qdS~! 7C>$q$a3J)֌s."핕 L0b8 ɽD+;겲+ʎ:(Xr.`/GǑʈ%D21Q!NJʎ7q6lvwC8iݺSᅽ].zGԅ@` "G_vKeuJRHw) L2ua[ 4t ㍞sN+g|7&KyC)Өbf\,k=HtsN]ebTT̬+`)WW}; X7?W[߯T/hlYUg0WP4'K-0!*æԌ;SM/42nfau@R:MG{p?-M} }c=]4U,?rJMhW߭3 ^Gyud۾%eZI*::~dv,r7GݔmDk^RynV^"֯l"r dr:GNP5Hሒ('؆ [h'$ؿ_dk+_~(9v;x?-Wj 04Nr~?̯ԁ[ȯ vfp4='j#bR%ⒿR@u`hsIBMAi˳)@胊fj˕mx7SUɍj>7 _+WM /JHVroCAps;7?V_=  "t~PhA5k3>XyqvQ|46>ol|~Pol|tc3o JCF+{cEO8 8R s%w1!: `B~>X-){IGذ(l)x.B iIlX/p8ג7pqS{!">S58tv(C7K5DW_JY) `@)k-;ks 9:ʾ^G3jV)ԴaaF5EEJ뗐*fys' =eAj67o)8e8x9e4Wġ湰iQhEL!ph Ei$-~oJtX۾ޅH3ZH-Y? 
5ɦP9:Dvp6#l:JA*сe|'QHoJʼn#2/9,3RC 5U>tV9SyHZIP2X0ݻJ/F&9}RGG)5-̄&Gm@ G洂kD(4r:v0Q6h>eߦD3ao)Ҋ\ UEDiC_HMHnP RZ~Au@[yH"[sEa0;F!^Cdo@TybU*n'{sXnX8Eo(`(ޮvQJ?<+684mgԨ3^f] Ma2#<1VNO~볉-;z;Ϯj R $'#O "WLz6Ows=r"Tly)@P2x=T+.w=`* ř̂N\ZN $r$ZBul>XO. KPR+CsYE.`WTݥvxfÒW9fEr٥QS.J~"&35'M #A9(Md*$U$nqTWLQ4*} "fѵYۙx뼹-BL3Q;ه?v=ZD+Y)֓Ncx^馤,^! (>d!6י/q#Mջ?{9Yγ|H#-r&ijN|lqR&av[PdHQ-k-!I6`mX-ҳzBdM\_4R[XcKi%9<+Ju eiU=^㦘61L[;/i'=}xᗌ7Ou4o}475N͓_0\bӧ}yW:#Tm5FŻYC&H٧- tAb5t;W.O1;I-W#A_׫t!WeH@D5.tl;$vA1~ N %7 8a"N_}e0)>*GintʜUJO&ƥ^r]~~ן7QLJ8Fny_-W,W?72n˸i/㦽nڻ'ufD(*eg:B¹hc$3ϊyk C+&?E~/ _L,aٕIv U DAk XxAOlJkz1((y4;ŘRV+n)ŻJ(nHt;* Nvŵ'}-QT1u~<`}{]͗Y.jٌtʸ;h۹[fؓܗ+_i_҈r2:S1:[G(.Dcvzk w+ -ڗftKniD-P>!͈܂?znΛl&}4(~ReD.8S).W4X>,L%2+Hb|-VՍYigb4>Vz2Ȉ83q6Miiy$q)'fux(zlC\T*}[Ȳꃟ+@ۋgDÞ1d\ yƐASgITl 7c.Q9HlFMgቅ-Z`hu-rJ@Akp鶖=N SBQ#늓j ]0H裇 >BkZ᣻DZr٭19 d1i"=A3F-`lҮUN ԓ;N[F_s1@|"q˛]\4kQ6`>έ>^l~:j фVqb磽qe,J!vrK{ʝҩWa-WIy`]ė_cbKblUИ|3"ȅ2ۙ#T"TK شT pE`'6/&rMRU)S6 ˽j$ p3?09\9$\(QbsmYŘ+h H:Ag<Xs#Hrvi>'$<]]L|i|oL sc|t?|avs'mT~{:6aB=Dek_U?t]c$~Yg] jy^s{ ˉV7 E(J jscu`"Grͯs3/YoEȁUtێXItPغvv=V@`\ҍOD=az l!Al,8y$W\WKbw$IEk}͂`B|B>io#/ra?#U %+]7_A#wSiT5KSKR("B)"P'Y4[I?::w M9.rd58ҳ:*bjGYNOq`#N6ÙL7|=$.6$HfQ~娘N2Or l;4goW=H\&F&_2p6UFnav򧫫uZefl܏"Z}%fu+!{,W$R1p}v`eVrvLlMZ=`5JP?9_œCXs:S׉F߯NU!)j j? 50c?łQKءo`TJ U"ֹ9,7?< e_[Ee=+-Sr93R}L{%xM߬^/]HIR<&^]䤬׿TLXW@LA"-?]]jZ&?~IмŴ ar施iںd¾bUr'Dd4:T'!Z=8cGYK,;iAڱ&:㥸r 8ťQ=BŽskWޞz/߯[({AhϲОe=B{feTENL0|*9p*huP{Dz/wrmsnke~~UbJn>2sޞs63ZZ{@)߮4wW^iU>;#0Xo+aVbەk>hȭ_⑍MZϋe1:W(n+7|G֖KmG_#wc$Xnm2<i3 h>g}kaUJaYy1/]לr*MJ av񘩹Uu$_v{2H+h/TR AKU0'LF?ll.eKÄc檿 NsQ[Ue+m8 4C(4w"}V00rFaAba9ޫ|}C3fj#O[!=vV@gx/gڣ,⽕l_d|d؎ =La3K[&M}N7;iK 5n/ylT2dMK M#4C޺ILka Z7ByS<xC) *!ǭPSTQhp)-۠F7:Ou 2`à QY#T1 ,44Nu j a':Ô7RX(e(XX0|@[]VY"[^R税=זѨE}R3ztkMvV3Fɀn\SLU6#!*j Q!Eh09q 7 ; ˗*|wsK\nus_Q72VUW}&ue0}{ɟO0;1d%> >L2%X<.?%yͻ 'ϏvwoN.l{H:@D^~q{l&@C파iA\ 6^Ts+x63eAW:(_^fƸZUL"wƨ#QKS $z C),q-:lXEQxf+1t M.Gs;=ϬnpW UNA8ڧh1Etd,A|h򾥓$WV8yC%}mcj=ÀX(c-m`n\+0`BT[r(T`tMxyKalL]T Pcz|5!a(um#VY`,= -h#zu-ՕjV9f3j]ŗ (2;#TɑJL:8\N㤖@8X)=3Nqh(Z1Tļ2{J,7Ǿt+{]̈I! 
I|Qa +^=3H*u..I/ճ|ɂl͗[7l7H}gM~;.Xw=_ =kѽgNYӟQTSraK^ [2]Sز%_AoffRp=b>dI aG>`l {1 S3ƥrDx Vc1gsyن E|%J{9uEH;r>vc3-|磟>GP(X/]#=.-Yxp/V98)k^P+4cv2vϨb"È+1.PIO*Z9?<WL b "ċ;v |"Ko U-(Xr[Zp~ +}ml_lO)-۳img3G@O^CM`aSV"}]yuq+TޢH&ʵR$ @ .{+Y}{WD=dIY}L'h 8=˟ܢ6Ir/25tV&Ov,(p:;V -DWd [|Kpв3@ =,/º[ZpnڇD-z~IBc5>gjB;" :EdJx_DZzD#lW-@6)Ul=Z'?/UPY̨UJ]e8bw9-/雘|- %䐡GPe⯗L|ˏ>Si"uPeU<uVHWc~[v[6fCYIE^ 2ķ NtE5pF7 Jwq_;8A >eN@{8)_,#"NaH"YPf Pƅ:~wFn=yَ.5\mDd7vfr:!޸3&Z5Xv .KR5{/YΗ{ íV80m{2<.-03=Iҟb9˯X+jX v"&TUA'8<(*ٮuٮ\ҥE>@n&s[ЭhBs 𼒴KUVbDRӽT{"1MղQ,zR Lb,H큱h-fЊ!hkNrHwPBϽm[~҅Z@E8qlݜ#WyYY4 tzS\CR^鏗#h(@>~A|x)Arq_/֗7NN+[TOgd)?]\Cap.p{䬝p`v( 'U[]Ld䌪53h4G?]]XmW6=ɍ`~X<{/";uRԸӓߝδ&zwnmͅ=t@!dl(CTEy/^ba7z`fJ&E^9Z cMdӆ%j` ŏ1I! .|+L6S6c[`RQKxKr3L b Ϭ7l lR[`D"kxDv)j)Q% +T@Rva̭ ߼ej\JMtHWl0Qh^KCD.o1oۦŒudls(t֌ 0 I~e VDw:Ml4!;WJ`կi5PgIwdJ$6f§T'C̼JGD ;p-\8bK06Tl7cu/' aȍD}6A;J(sxU\3`ۼtxP! ,JUJJZ;SEO^.UdP^v;Ȓ[4dXĵB,nʓ1Q ܌ʆ.MgiBAk7c2K\PFép뚴qۧO&`fܳSCݓg`vm3#"@]zr<ǀqz3{I_d{L3&lu!?'4zCTL%N &1&<2t3̋u0K#% Foa0o8j cYmIK}]ms7+,}\-)4*}9ۗJƕ\TH~!% ɡ̐4q,znt !}!ZE]x1 I 23QW}Z,xJYa9҂~\"9G{ f2 p!(f'Y"32 rDAtO~uۮvV CkBJZ^M @V20-p,x4 Leѵ}Q /6Cj #|)A`D W`™3!CBh`oD`tĩW#$(Y!w^8[W:8A`w4Ⱦͩ̐ĭވpx@^v:OD2ZїJz(+S.T)t} A{P1%1[iYPsՊ\h؏KV s`UWTIT ǩW>zFn(!, V孲+тv KPN`c 6N:<5ޣ厂 !S -BZm\ƽtQ'm6fRU0xJV{HG4E?cDڤcDJ# 5Imk'ER*r hdIܽce`J&7$ oݠ@$w3A-g(%J8L|"MhD>gKwRBN2Ȟ茈lDTp@!YS*pR,B R!g>vPʤ}gjj:q |/i2ah—ziF8j.J$t< 5&7 [4&(/Sa,h1_>/%I Dd Ci̬:4#D](Bڲ.fRVQڼK[9T1MCSyLdDA'\ڭcNoTq;- Y9cf{u|)FvHoK(W&_(pumJHm [E~5Oe hĝ:B H}ic|"ohgRf\r^F={>cryy4ZK^.Wqhjk4D 7i l~a_PB>__Cnr!CCNpqrKx8ywy4wu`I܆QI _Kb@NG$e^Y 5mO墯Y yP$=rJ׋סyEI Rݛyvv8<+#?|wVFKGd|.G=fʦ}QVX9,vI5tdwL=rbEe[5z&N}pc&*M+wקVufW"-~L*e֟Bj.-hw(Jko_}h~7[}bolY;V$wf u $oW?8U+<Woy*-bf |e RX=zaŻ۞wܭ}HeDnul\?|J2J K/)tI}XT(<|J\!CH?ܒRuC{ҭip|޽R@?;ʣ>)>(yC^w=^ifKķixЃSWmAF+1jϔ)!h)M66Gl铕P>S +@ w/4Ȗ?sl6r"tl+\d{&x >/+?%~J./qcTBc,UWT)9= bњmBJrx#h=gm;m"꺹@\"^1G\ W^P0 u ,g[Y`y)݈@+*B, nOXvYCGbU/vB:݈%NTtĨq8M*ZO{Y15Ot黱j[;E, I Q\DQq !}DIW:DAN,&W 9 gm*VK)gM`n`mKtzu1}( 0u2t2aV'/2fHDj,aqھc1wip? 
#x4BW d>'_$"y{zSIrP(u y΂{s/a^r#<ȱjk2o|9=E>a9&S#+#r'V~v<:-체+79y^?ּLT93ٽL:T>6˲Z(;!4}@xh"<[]4tNϧ$FF#9X$y둠 8 R+#;S_͂S$ &x#$HmR4!'tX 'rk [.H$Sj,c)QEHjM\D|B ;H G I#+ y,'B]j hZN egș긱_g !$5{b%_EaGGjGNOJ8*Ps $*udpt(!h-8.b]Yȩ{M:DD}H`$FRpT _Њi(E!R;ܺLRlC_A ӊԮSpCA9@hd2(bhFH\EB(ن!2m)a0>ܔNV7O`7W_v.LhV/v^^oh^iuܖ; fV[w_vg};X- peRhցUWX3L岑[l2ɌpB*=& qi_[ GWYJFnMYܘ\Q)=W1>f)`X|Q~(󅛎?vv$?p1ƲeY:1Ĉ 9y93ŮCB\o޿Hw/'k"Kl[D*+eݍVT)g%v4"t7z-ԟ&Jc~M_ <),ߝ95e)8[(?$dB#V[4N{wTRuz5/(lxR~Y3sJTdu[ v I=u_o=흵$j}ZoըkfIٓ N١DQYM}:hD]ki]Q)=-7j_5}1GvgOL`֚sɏ2Nf׫K@7zt"ݣf:=7bh(l+IsSi6_l8/.+7y- m(Lާf3cXlW\/Jzncꔇ{xkPt/{bޞ)-f+[,p-2??r3ě QC:shPS-xZ%?gW;F{b3 良;3Et\,8wq4{SiZP]˞~A={(pV8b<+}|fWsnzN=Kj*|LӅ05=5jRHGV+rx''q0R(煦|( " sJPe1Jre: n,&bfZKJ[5nq>r- b6L ôF+yܖؕ/7{6 ɗdQO/e8_/Bb?mtl9Wpc8Iqu Keo;5}\S* -@8g Π/$ +LQVjf];kx5㦳D#dGhCy\]oGW} Qp{~B~ٌeI!)E&)iH΃eqSdƮ(MTsiNR2f㔢A}}XF. k|no&F'vjB <0YگNDj~٫&qnUCڍQݧ +8Ƀ[{!%G/Z¹0iVҹX!\]hnJ 4P{nDoۛ^lf]u/xl2dUuj_"Xnv {YN[I1`91F R/Wd6i;^bCHۏ+yՂ8/1_^ͅ0'n͹ hj4 `7y+;Qx$8G!yҦW^ QmŞWp~Hk~z*9 Jc.({Mo[hIe_mI/.z 8 ";N;twӫ&@d`2M+ _ Hu#B&?EZF6@ѹDK Yr0\rP,v^$WEΐ{E;O3K#%ќxPnGYqAsU@)oJ3E0 ݜjiqs8IhLYlY#Њs]PF]iJet2S 6k^F- z#N(fX?#)#AR(υn0}ۣ6.Dfx492=KdiDLh,C fMRI, Ԋи!dbcjf >,MQ7vp$t8Zr$> ˨BȘTQWzZd-qHCS7UC}ˌ j$xe ŅuWRiMK=,BKr^kA!uk,cDy .^sb RGgA2C9րB.ṗ@\t&z\W8?L K/2|yXb'hnGO,Ez,PooZcL^J'n1eճI^ޜP#cMs@HJ_pJUXMhh(_T!{*k=ojG^0C `~9s6kyD/tDifmÛt T˽j:0$zF>2K5%ԪӋޖ$F([i6@ѣ;Ke]=Pit_bk#N8AGE*b:9n.x_(%h;Q· 8޸ x:!L`:$g3ĽMN{a4&ᔆEbb %Qﭳ1$ /z\PN ȘUdՈ$Aǒ*(йUdw 1>5*L⣀Tx`s~ *q|sv/S|K.0\La!6d]LT>u n\k2PNz] Ճ8 Clo[^d'h]JNߍG]}\?>Sh5+9 =Ijxi^3"lΗFhr٣CLf(r>|`g߾X:Unم $-j̣&ƳV9lʒ;oSQ%Srh4&JˆC.@ǂS#z`4?@>VR'y)Eb"()eNm))%Dok$FmyXAhj!SZZFBU g\8:@Abox@b^=p#ǒmΠSRb':j&8Vznp㺫< տm!77h-(n ](h…fR{jy"Zb-\c9wd`o@ߖs嚩Mln8Iߖ-fV٧-,uZ~i&hc KZ8^= ^Dao9r#:E}*҉#6$ W"\Zw4:hmBTNk@1(S:L%qDgϖ:Pjk hkRn v>oYxU/\a M@1L@A4(3Ʃ7/dRXOύiw(<:Nn~SٴNQO f(* u?\_.O jn#PJJ Tj5oq4̭]3f Ҵ:y9|~?&|Tfן86ݲ)AbG;?[#zV&`QCN~m{eʱ ^^dLonՂG{ }`2\l@7ܘ;<ۦ{~gl9 D:m>j>b+3*?kWpPE~-g;;qMd缮Rd"0PW#칺+o㍒83umLg'^y(#L$1J IG&6aȪ%c]GX(ѳ8 A+J%Qƒϵ1YS8)O-eI;)@>L)ecvY)Z 9Qj_ȥ+ f3 
eG*eB)e5.t7@Q<޺Mm3ALUOgYֺʟ6ص#yyscpQ)6eA̾|zu5O5.f={uIF'QqT 0˨.&QqҐ Qءbbun}f^WjblWet5Q^/]W;=,*kĨ^aa_޶kDi@0Hku򵴖ߟ"`1JA%{/MCj9f\8a!1]!Ol>zP' $)Sv+Gjx0_Ƈ^z0sq}G;S;C{7EVƬwв/RFб \HvZGQ)倧٧˧gh=] 4(%ԜvvyYC&eTVaasp#QTM'Vs3z4䲓ttL$ Z hS# ЛAT 4ǎ b >QHbDu>aO^z#ϽBYwr3T! 29fP{S}vvCMȟr^I 5{w;~\fzp'(X *@]Xڂ;t}vĞ^,3ݛWWc>u{ hB 9>ՙc@UZRiQ0zu_ZKX/حKYIRK ^% 5QMj;u!Si~n!vWU*<>P~9(z_iV^\+2ti@X#@,pEp)qpgo1Юa|飮~ hm+ח (B0/B3\A%*B⾐nWnٻ+Qۤ:(P:Z|rew.Q ǬE([|2N#E:"Z %k%g8}`ifL!*oTtu7B h=o cXv}a~/3k\P:LUaԔa4Z7_\췅c-4 }'X&I=9o ?NQ!5{4׬Ƿ\An~V68́y;Nr'WKb25 ,>xJؙ}>K6uϒ|WK^y+ME)f݃5KvXOFG4RecB݃lt`T_ԹsqﮅsB߸|_OyڼD>t1fxEsgp=>ugW50!NG{;zYYIO.d y9Ҡe]V&[oDj!$'^2v=Mj ڭ/ NXFaAxHBE[>ZIv`́X6x Cٍ>Z +vRRڱjZMzbt_?~|1'jX.?0lby49PA%޿WI9n[ĝ?^yP:A\nu)vC}OMKGTgk)tztx~dx tC0f}6qvYއ+\y]DZ5KKz1[2e$7JǙV(UXҺx: X0KM)"3=Phs"AߓsԔUb@zs[A n@=.~] 3PLku P]Y*O}@}eSvpW1i^V~zL1Qz!`^O9NH.uX'y!_ }v[l;m!_;! {0-5vT ۘ?Q֤ɯa]^-Q)QDd2dɐl 0cY d+lv7yDV+.7L^U{\3>{Nz7Ia+.d?^j0 Y/vc%l*=D7-2rqoݮXʡݰ'Fmt\o;&GWY6N]pA'O7=(Ur£P[őw(GsJ1`!e6֫8WVXf=T?dpwC}05elݽ8~nd4[=hGp9І(D5޾Ti#0fܼ.y'Ԍko RG_5 ~gdAW~:rJgջZsz 'գHX.nHϫsZ"x=knB吏8Dr,~7]愰zN vZi9F 3[mUZ|pRڭjrNJZJ.%ؽbU.(Kkc4By׼`ɇS3*9H-f=d_Zy;5Bw -6yGێS3d1uL5]N҃O}Ͳ@MB;)2ڊ ,ED#s,oOo`w%ߚŮrniwT$S_W2<X݊zO"\``==ZJ 8 qCP$ 2Q VX8=5 &Q|y ]m$bݤTZk[C=.@cTL'P%L eUY3,gh}Kud8xwOs 5WE87wNN엤8"j- J+B=BqtܦPFݠzߏbxdas[ X&G͔ r*EK-Zv<7lfaɻ;=z[}܈6n"\6:$8VΏ J"(x,дv邕o:U6>MG[U#'hZyJ&j)9TM!Q&Qoj22qI#[ iޥjlP|N dj+tB68dD~e.3DW)(CΨaL0 L ;>JNhQդ[Vg[?B4FXsݬy59zz`dAȢ!늓& sTxQI32 heZK)v[(G'Ò.OZDCq GwBHaZDT^ l)`B^E{ ~QV,-?E(|Bn8`D`u`([|ỡ4PbkJ!j^$m#D%d1"3b%b ѬN*D!%A֚E %qHTxS-Q6+yvXxLjOb/Q2ߠ0*)w9[N!Ԅ]KyP#!vGJs7K6'-'ij;-`TTkґ+VR6]t>:9\(Kp;> H^Y1A14L-Rp:"@+Z] j-j.v BhXO-[C)Jm-sP%עX2SJ9[|Db\?>__5z67{c (nS?,K 箔Rip9[TggW4TW{&\e qD*%߹սN hj ؟G5Ygfqt~+۱s[Ƕr]^o'cpP "Jr L)h+۞08>Vs'١2`.s1t%5ڭ3`w2ݱK~Jf#;’֚nAƵ^ElIvvP čy| E|+4*Bxy53д/̟?~oR_i:_},UvECoukQq|Ϯ;֋$`~SBZ9*6AߓM@l}Ms;wv|Kr.xw~|t4٧H& q¬̂r"׹߬m'ke텝,F |9?-ݛ,vb ~=vEOox/=ZG`Q-jMvZ+ăy1SaKܚxWC6KSi: w{`y>XhZ|$uR.F*R)٨RHr O{WH_w B%@tg4wv ܦN`Qe'v"Uvb:q*HJ"Ee% ZZn8h#BKv#Ș9Jn%£\rrS I2PcTlsn3UPn:x,yQ@ູTXp Ke%] 
-(t^Q|XD-ٌl,կ5X#V;hWyo&ͿެLݛ/Uv򭗍oP`98),!>pfZэzXl8We잛~~{ ODs>}8joјlBjB91Y2Pr|zumrfDw[0&8dY@n܌ƶ@"n';DԔP Q\DDj+P 0{{{kJ&ڑ|Xnzߐ]MN'Va =]~ ~OGJ þՇv2=Wz(Høkyl?7 ")tlOٴljZ>Bpuu9*ċE}oILp=0xWH̐Q1Vt~>= Rgtmy4w\EM桱n}ED?l,ֻόrOw)*^ ^8|TTE[:ǟ:>QjoKNTDx"[Lrn7(ZesHׅj Mj8{qu{QR\pB!U*}i_&i'XR&t/# nLzV=eg[E]u%,H˚▂r)9~ɛ^M>W Od4䅫hNInSoZ7 aRy:n#F_$Sni֭ y*SkuCx-!6⮇d:`^ hА-:%fgcerʜ^O_;L!3W/3BFxzwkMh>{Hb$VHDXuz;櫾l ܓRyGw*ֹ5Tt`x~=߶8gr`)X~4J;P]( `@ q-mUPYJڢ@pN E/_Oh5q}`Ơҕ ڇ6Ȉذ"U*BRv5{sGAo`W}h W$]tw0fKHr]w9Fb틠W +־zd=;Պ|jZj0 yߝ(`-} 1߃pM)Ê &=TN;bۈCڷuKoEhА):巐<ԈsúI qʃI}Guq%̺7[ n}h WBy^1|; &lߓ}l!#O^i[.Pk[נ鳬0ۮBe!!2*Hh^ z[)U(95Js纯Be}vPVbg _XBw ban$yupd9V~)txe8;qW2WfS ͓O ,vq}m/8Xf &;M{QKw .jY>*swrCZdU!Q4lȢkb8<:,WuK idŽA]Ld~sH/-DǕ4f雝e2 yE`',z܌KwU*?|xVɝ.|JgOEZi}Q絋'=ݠzU{ ݻ#2?d/nIbםξV_ tD#9|iX3,[uIAzV9-Ir7Y > XgQPh>vLOaH"iw/Netxcy&#IҀ#-z`~/ݼ QO{{D/E*Bi&=z,eɓ_$V,d%X$y =6U$[^so> p&un}Q̯ȺuPnk'o̮GdWI䮤f<%t7ZWբbe'9y'(]SlZ^9<5/GvF0:߷#\r [Zo@p~uA_~Z!.1Vh;USY5%SY<+ Ow.)n}^פ mk!7[cl竌H3ŞVTM'2a絽ʦ' ~ϷC$sELh_!;NK !D 漭:^@.}W]!?u6iqs1HiN VդwVY3gEϜ=jb~Nl5e\Ƃ%^֒ShoK^K ƖFi] %LL~qHA+Ys+E'U0h]';buj_/oTh>Ci_@n5 [zrqg䣘e3#ÂfyM)uEr[ 99#*+̤Qe>C9 ISla;rZKHKs$gڑ1ѶA9̬5xL=ㄸ1rH6HNH#;\\phmP32P$EsEԃ;d)Gɫn{hhy`tF Hv# om[S1X'aS, ufbpa; xpha蒌> C$*V^O8b(cj=#A㸩(|:0?Y!wrޜ]}w]ݴ; C0Ѷ^lE 8ܥ#*`8 3e+q=!+jSNkV[%gl#cC M5jх Ra`cjcT&5W-=j-'!RάûcV`+(X.mjY @XYj֢IOXИk1OTgum *kkkBK1o1Qem R*xV]lrJAQPd&5 C!rwz+X /GLvar/home/core/zuul-output/logs/kubelet.log0000644000000000000000004034447315153664357017725 0ustar rootrootMar 10 00:06:24 crc systemd[1]: Starting Kubernetes Kubelet... 
Mar 10 00:06:24 crc restorecon[4706]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 00:06:24 crc restorecon[4706]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 10 00:06:24 crc restorecon[4706]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc 
restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc 
restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 
00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 
crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10
00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc 
restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 
crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc 
restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc 
restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc 
restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc 
restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc 
restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:25 crc restorecon[4706]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 10 00:06:26 crc kubenswrapper[4994]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 00:06:26 crc kubenswrapper[4994]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 10 00:06:26 crc kubenswrapper[4994]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 00:06:26 crc kubenswrapper[4994]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 10 00:06:26 crc kubenswrapper[4994]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 10 00:06:26 crc kubenswrapper[4994]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.274747 4994 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280228 4994 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280266 4994 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280288 4994 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280300 4994 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280312 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280323 4994 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280336 4994 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280346 4994 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280356 4994 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280369 4994 
feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280381 4994 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280392 4994 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280401 4994 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280410 4994 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280418 4994 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280426 4994 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280434 4994 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280442 4994 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280451 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280460 4994 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280497 4994 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280509 4994 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280517 4994 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280528 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280536 4994 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280544 4994 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280552 4994 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280559 4994 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280567 4994 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280575 4994 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280582 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280590 4994 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280597 4994 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280605 4994 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280612 4994 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280620 4994 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280629 4994 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280637 4994 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280646 4994 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280653 4994 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280661 4994 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280669 4994 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280677 4994 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280685 4994 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280692 4994 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280700 4994 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280708 4994 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280717 4994 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280725 4994 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280733 4994 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280740 4994 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280748 4994 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280755 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280763 4994 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280770 4994 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280778 4994 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280786 4994 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280794 4994 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280801 4994 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280811 4994 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280818 4994 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280826 4994 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280834 4994 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280841 4994 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280849 4994 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280858 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280865 4994 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280910 4994 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280920 4994 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280929 4994 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280938 4994 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282633 4994 flags.go:64] FLAG: --address="0.0.0.0"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282658 4994 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282678 4994 flags.go:64] FLAG: --anonymous-auth="true"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282689 4994 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282700 4994 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282710 4994 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282722 4994 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282734 4994 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282743 4994 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282752 4994 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282761 4994 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282772 4994 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282782 4994 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282790 4994 flags.go:64] FLAG: --cgroup-root=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282799 4994 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282808 4994 flags.go:64] FLAG: --client-ca-file=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282816 4994 flags.go:64] FLAG: --cloud-config=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282825 4994 flags.go:64] FLAG: --cloud-provider=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282833 4994 flags.go:64] FLAG: --cluster-dns="[]"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282843 4994 flags.go:64] FLAG: --cluster-domain=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282852 4994 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282861 4994 flags.go:64] FLAG: --config-dir=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282907 4994 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282920 4994 flags.go:64] FLAG: --container-log-max-files="5"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282931 4994 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282941 4994 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282951 4994 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282961 4994 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282970 4994 flags.go:64] FLAG: --contention-profiling="false"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282979 4994 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282990 4994 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283002 4994 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283010 4994 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283021 4994 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283030 4994 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283039 4994 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283047 4994 flags.go:64] FLAG: --enable-load-reader="false"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283056 4994 flags.go:64] FLAG: --enable-server="true"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283065 4994 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283077 4994 flags.go:64] FLAG: --event-burst="100"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283086 4994 flags.go:64] FLAG: --event-qps="50"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283095 4994 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283104 4994 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283113 4994 flags.go:64] FLAG: --eviction-hard=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283124 4994 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283133 4994 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283142 4994 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283151 4994 flags.go:64] FLAG: --eviction-soft=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283160 4994 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283169 4994 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283178 4994 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283186 4994 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283195 4994 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283204 4994 flags.go:64] FLAG: --fail-swap-on="true"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283213 4994 flags.go:64] FLAG: --feature-gates=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283223 4994 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283232 4994 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283241 4994 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283250 4994 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283259 4994 flags.go:64] FLAG: --healthz-port="10248"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283268 4994 flags.go:64] FLAG: --help="false"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283276 4994 flags.go:64] FLAG: --hostname-override=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283286 4994 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283301 4994 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283313 4994 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283324 4994 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283336 4994 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283348 4994 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283357 4994 flags.go:64] FLAG: --image-service-endpoint=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283365 4994 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283375 4994 flags.go:64] FLAG: --kube-api-burst="100"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283384 4994 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283393 4994 flags.go:64] FLAG: --kube-api-qps="50"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283402 4994 flags.go:64] FLAG: --kube-reserved=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283410 4994 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283420 4994 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283429 4994 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283438 4994 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283446 4994 flags.go:64] FLAG: --lock-file=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283455 4994 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283464 4994 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283472 4994 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283486 4994 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283495 4994 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283503 4994 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283512 4994 flags.go:64] FLAG: --logging-format="text"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283522 4994 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283532 4994 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283540 4994 flags.go:64] FLAG: --manifest-url=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283549 4994 flags.go:64] FLAG: --manifest-url-header=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283561 4994 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283570 4994 flags.go:64] FLAG: --max-open-files="1000000"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283580 4994 flags.go:64] FLAG: --max-pods="110"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283589 4994 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283598 4994 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283608 4994 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283617 4994 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283626 4994 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283635 4994 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283643 4994 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283665 4994 flags.go:64] FLAG: --node-status-max-images="50"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283674 4994 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283685 4994 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283695 4994 flags.go:64] FLAG: --pod-cidr=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283703 4994 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283716 4994 flags.go:64] FLAG: --pod-manifest-path=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283725 4994 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283734 4994 flags.go:64] FLAG: --pods-per-core="0"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283744 4994 flags.go:64] FLAG: --port="10250"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283753 4994 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283761 4994 flags.go:64] FLAG: --provider-id=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283770 4994 flags.go:64] FLAG: --qos-reserved=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283779 4994 flags.go:64] FLAG: --read-only-port="10255"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283788 4994 flags.go:64] FLAG: --register-node="true"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283797 4994 flags.go:64] FLAG: --register-schedulable="true"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283815 4994 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283830 4994 flags.go:64] FLAG: --registry-burst="10"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283839 4994 flags.go:64] FLAG: --registry-qps="5"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283848 4994 flags.go:64] FLAG: --reserved-cpus=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283857 4994 flags.go:64] FLAG: --reserved-memory=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283868 4994 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283906 4994 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283916 4994 flags.go:64] FLAG: --rotate-certificates="false"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283925 4994 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283935 4994 flags.go:64] FLAG: --runonce="false"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283943 4994 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283954 4994 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283963 4994 flags.go:64] FLAG: --seccomp-default="false"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283973 4994 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283982 4994 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283992 4994 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284001 4994 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284010 4994 flags.go:64] FLAG: --storage-driver-password="root"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284019 4994 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284029 4994 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284037 4994 flags.go:64] FLAG: --storage-driver-user="root"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284046 4994 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284056 4994 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284067 4994 flags.go:64] FLAG: --system-cgroups=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284076 4994 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284090 4994 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284100 4994 flags.go:64] FLAG: --tls-cert-file=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284109 4994 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284119 4994 flags.go:64] FLAG: --tls-min-version=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284129 4994 flags.go:64] FLAG: --tls-private-key-file=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284137 4994 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284146 4994 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284155 4994 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284164 4994 flags.go:64] FLAG: --v="2"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284176 4994 flags.go:64] FLAG: --version="false"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284187 4994 flags.go:64] FLAG: --vmodule=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284198 4994 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284208 4994 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284428 4994 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284441 4994 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284449 4994 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284458 4994 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284497 4994 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284506 4994 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284514 4994 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284522 4994 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284530 4994 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284537 4994 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284545 4994 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284553 4994 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284563 4994 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284574 4994 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284582 4994 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284590 4994 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284599 4994 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284606 4994 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284614 4994 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284621 4994 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284629 4994 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284638 4994 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284648 4994 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284658 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284667 4994 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284675 4994 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284684 4994 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284692 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284699 4994 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284708 4994 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284716 4994 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284724 4994 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284731 4994 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284739 4994 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284749 4994 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284758 4994 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284766 4994 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284777 4994 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284788 4994 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284802 4994 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284812 4994 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284824 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284834 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284843 4994 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284853 4994 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284863 4994 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284904 4994 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284916 4994 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284926 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284936 4994 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284945 4994 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284955 4994 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284965 4994 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284975 4994 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284985 4994 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284995 4994 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285004 4994 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285015 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285025 4994 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285036 4994 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285046 4994 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285057 4994 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285067 4994 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285078 4994 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285088 4994 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285097 4994 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285106 4994 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285114 4994 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285121 4994 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285129 4994 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285137 4994 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.285995 4994 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.300560 4994 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.300982 4994 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301135 4994 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301151 4994 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301161 4994 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301171 4994 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301181 4994 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301193 4994 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301203 4994 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301212 4994 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301221 4994 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301230 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301240 4994 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301249 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301258 4994 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301267 4994 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301276 4994 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301285 4994 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 00:06:26 crc 
kubenswrapper[4994]: W0310 00:06:26.301293 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301302 4994 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301313 4994 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301327 4994 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301338 4994 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301348 4994 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301357 4994 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301367 4994 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301376 4994 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301384 4994 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301395 4994 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301405 4994 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301415 4994 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301424 4994 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 
00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301433 4994 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301441 4994 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301454 4994 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301464 4994 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301473 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301481 4994 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301490 4994 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301498 4994 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301511 4994 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301522 4994 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301535 4994 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301544 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301554 4994 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301563 4994 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301572 4994 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301581 4994 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301590 4994 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301599 4994 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301608 4994 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301616 4994 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301624 4994 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301633 4994 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301641 4994 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301650 4994 
feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301658 4994 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301666 4994 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301675 4994 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301683 4994 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301692 4994 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301701 4994 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301710 4994 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301719 4994 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301727 4994 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301736 4994 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301745 4994 feature_gate.go:330] unrecognized feature gate: Example Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301754 4994 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301762 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301771 4994 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 
10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301780 4994 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301788 4994 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301800 4994 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.301816 4994 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302161 4994 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302178 4994 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302188 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302197 4994 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302206 4994 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302214 4994 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302224 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 
00:06:26.302232 4994 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302241 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302249 4994 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302258 4994 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302267 4994 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302276 4994 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302284 4994 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302293 4994 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302301 4994 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302310 4994 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302318 4994 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302327 4994 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302336 4994 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302344 4994 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302353 4994 feature_gate.go:330] unrecognized feature gate: 
InsightsOnDemandDataGather Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302361 4994 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302370 4994 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302378 4994 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302386 4994 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302395 4994 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302404 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302414 4994 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302435 4994 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302444 4994 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302454 4994 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302463 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302472 4994 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302483 4994 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302493 4994 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302502 4994 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302512 4994 feature_gate.go:330] unrecognized feature gate: Example Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302520 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302529 4994 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302538 4994 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302546 4994 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302557 4994 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302565 4994 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302577 4994 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302587 4994 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302595 4994 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302604 4994 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302612 4994 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302621 4994 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302629 4994 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302637 4994 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302645 4994 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302654 4994 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302662 4994 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302671 4994 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302679 4994 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302687 4994 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302696 4994 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302705 4994 feature_gate.go:330] unrecognized feature 
gate: MinimumKubeletVersion Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302713 4994 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302721 4994 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302730 4994 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302742 4994 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302752 4994 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302761 4994 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302771 4994 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302781 4994 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302790 4994 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302799 4994 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302807 4994 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.302821 4994 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.303905 4994 server.go:940] "Client rotation is on, will bootstrap in background" Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.308347 4994 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.315306 4994 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.315483 4994 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.319936 4994 server.go:997] "Starting client certificate rotation" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.319986 4994 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.321034 4994 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.346617 4994 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.349600 4994 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.349704 4994 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot 
create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.373284 4994 log.go:25] "Validated CRI v1 runtime API" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.415018 4994 log.go:25] "Validated CRI v1 image API" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.417705 4994 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.423548 4994 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-10-00-01-09-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.423605 4994 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.456632 4994 manager.go:217] Machine: {Timestamp:2026-03-10 00:06:26.453614174 +0000 UTC m=+0.627320973 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec 
SystemUUID:c9a6b1d9-12bb-4e1d-8072-25b4f73868f8 BootID:9894519f-677e-4b1e-80a1-f7e7d58a0619 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:98:67:e2 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:98:67:e2 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:4f:02:a8 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:b1:e2:a6 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:87:4f:5b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:70:d0:db Speed:-1 Mtu:1496} {Name:eth10 MacAddress:f2:2f:57:75:3c:d9 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:1e:40:c2:aa:a8:34 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 
Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] 
SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.457098 4994 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.457298 4994 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.457943 4994 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.458262 4994 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.458333 4994 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.458668 4994 topology_manager.go:138] "Creating topology manager with none policy" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.458687 4994 container_manager_linux.go:303] "Creating device plugin manager" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.459378 4994 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.459431 4994 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.460203 4994 state_mem.go:36] "Initialized new in-memory state store" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.460359 4994 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.464439 4994 kubelet.go:418] "Attempting to sync node with API server" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.464479 4994 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.464522 4994 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.464545 4994 kubelet.go:324] "Adding apiserver pod source" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.464563 4994 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.468954 4994 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.469574 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.469681 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.470115 4994 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.470350 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.470482 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.472581 4994 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474515 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474558 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474574 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474589 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474612 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 10 
00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474625 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474639 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474660 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474676 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474691 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474710 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474724 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.477073 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.477788 4994 server.go:1280] "Started kubelet" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.479177 4994 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.479133 4994 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 10 00:06:26 crc systemd[1]: Started Kubernetes Kubelet. 
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.480038 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.480736 4994 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.482203 4994 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.482292 4994 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.482606 4994 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.482639 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.482653 4994 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.482669 4994 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.483431 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.483523 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 
38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.483590 4994 server.go:460] "Adding debug handlers to kubelet server" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.483926 4994 factory.go:55] Registering systemd factory Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.483969 4994 factory.go:221] Registration of the systemd container factory successfully Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.484299 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="200ms" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.484348 4994 factory.go:153] Registering CRI-O factory Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.484414 4994 factory.go:221] Registration of the crio container factory successfully Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.484529 4994 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.484568 4994 factory.go:103] Registering Raw factory Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.484599 4994 manager.go:1196] Started watching for new ooms in manager Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.485830 4994 manager.go:319] Starting recovery of all containers Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.486184 4994 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.164:6443: connect: connection refused" 
event="&Event{ObjectMeta:{crc.189b522282d4604f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.477744207 +0000 UTC m=+0.651450996,LastTimestamp:2026-03-10 00:06:26.477744207 +0000 UTC m=+0.651450996,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.503634 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.503729 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.503762 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.503792 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.503819 4994 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.503844 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.503868 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.503935 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.503964 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504002 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504025 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504050 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504083 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504119 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504143 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504181 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504208 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504234 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504260 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504287 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504318 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504350 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504406 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" 
seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504433 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504458 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504483 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504583 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504614 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504638 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504674 4994 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504702 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504742 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504766 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504792 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504818 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504844 4994 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504924 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504957 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504985 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505013 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505039 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505067 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505093 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505118 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505146 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505171 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505201 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505232 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505260 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505287 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505315 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505340 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505374 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505400 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" 
seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505429 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505457 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505483 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505510 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505535 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505562 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505588 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505700 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505732 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505776 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505804 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507520 4994 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507592 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507630 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507660 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507692 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507744 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507774 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507805 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507833 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507861 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507929 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507961 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507990 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508018 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508087 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508116 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508139 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508164 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508192 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508222 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508249 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508277 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508305 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508330 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508394 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508425 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508452 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508479 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508504 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508530 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508558 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508587 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508615 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508643 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508668 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508694 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508722 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508749 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508794 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508823 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508864 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508931 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508969 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508999 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509029 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509060 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509094 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509122 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509148 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509179 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509206 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509231 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509259 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509284 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509309 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509334 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509362 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509394 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509420 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509446 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509473 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509500 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509534 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509565 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509592 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509621 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509662 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509692 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509719 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509748 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509776 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509805 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509831 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509859 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509931 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509961 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509989 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510020 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510050 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510074 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510104 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510131 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510156 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510185 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510210 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510237 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510264 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510289 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510315 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510344 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510369 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510396 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510435 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510464 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510493 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510519 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510544 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510563 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510582 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510601 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510620 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510643 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510662 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510680 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510698 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510753 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510778 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510796 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510816 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510835 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510861 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510926 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510952 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510972 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510993 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511013 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511032 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511052 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511079 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511105 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511130 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511155 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511176 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511196 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511218 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511238 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511259 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511279 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511302 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511327 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511354 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511380 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511400 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511419 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511443 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511471 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511498 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511527 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511552 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511574 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511594 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511614 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511635 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511654 4994 reconstruct.go:97] "Volume reconstruction finished" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511667 4994 reconciler.go:26] "Reconciler: start to sync state" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.520414 4994 manager.go:324] Recovery completed Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.538090 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.540727 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.540781 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.540800 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.542280 4994 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.542311 4994 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.542347 4994 state_mem.go:36] "Initialized new in-memory state store" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.549550 4994 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.552620 4994 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.552679 4994 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.552712 4994 kubelet.go:2335] "Starting kubelet main sync loop" Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.552783 4994 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.555464 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.555558 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.565110 4994 policy_none.go:49] "None policy: Start" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.566551 4994 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.566599 4994 state_mem.go:35] "Initializing new in-memory state store" Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.583393 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.628174 4994 manager.go:334] 
"Starting Device Plugin manager" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.628246 4994 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.628268 4994 server.go:79] "Starting device plugin registration server" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.628834 4994 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.628854 4994 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.629190 4994 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.629305 4994 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.629319 4994 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.640664 4994 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.652921 4994 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.653048 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.654400 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.654457 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.654470 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.654716 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.655082 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.655140 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.658539 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.658668 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.658690 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.658818 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.658925 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.658940 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.659692 4994 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.659910 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.659980 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.660998 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.661041 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.661060 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.661182 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.661210 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.661227 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.661240 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.661462 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.661541 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.662450 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.662492 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.662511 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.662658 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.662686 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.662791 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.662837 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.662687 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.662942 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.664122 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.664174 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.664191 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.664220 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.664265 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.664303 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.664557 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.664615 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.665904 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.665944 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.665961 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.685865 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="400ms" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.715451 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.715557 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.715596 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.715631 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.715674 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.715734 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.715809 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.715913 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.715953 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.715997 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.716025 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.716055 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.716084 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" 
(UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.716113 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.716142 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.728993 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.730240 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.730298 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.730320 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.730357 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.731026 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.164:6443: connect: connection refused" 
node="crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817501 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817560 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817601 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817630 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817657 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817696 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817726 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817757 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817765 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817791 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817819 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" 
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817826 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817856 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817854 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817934 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817948 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817911 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817977 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817996 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817973 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.818017 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.818022 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc 
kubenswrapper[4994]: I0310 00:06:26.818046 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.818023 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.818071 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.818086 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.818052 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.818134 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.818195 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.818269 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.931323 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.932837 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.932942 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.932963 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.933042 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.933528 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.164:6443: connect: connection refused" node="crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.999557 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.018225 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.032186 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.057739 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:27 crc kubenswrapper[4994]: W0310 00:06:27.059306 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-9ccfe79d4dfe6da753fc95167b67ebc134a794926f1a11929e3a732af141d1a4 WatchSource:0}: Error finding container 9ccfe79d4dfe6da753fc95167b67ebc134a794926f1a11929e3a732af141d1a4: Status 404 returned error can't find the container with id 9ccfe79d4dfe6da753fc95167b67ebc134a794926f1a11929e3a732af141d1a4 Mar 10 00:06:27 crc kubenswrapper[4994]: W0310 00:06:27.061245 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-6858f9b978a8d205cbc7547b7638cc779b2799ed14a8bc8a7d70dff467b6f893 WatchSource:0}: Error finding container 6858f9b978a8d205cbc7547b7638cc779b2799ed14a8bc8a7d70dff467b6f893: Status 404 returned error can't find the container with id 6858f9b978a8d205cbc7547b7638cc779b2799ed14a8bc8a7d70dff467b6f893 Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.069446 4994 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:27 crc kubenswrapper[4994]: W0310 00:06:27.079939 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-6c46f93fa4d73cf6270f364f11272df6b45c18596808880a68c78370736f59ee WatchSource:0}: Error finding container 6c46f93fa4d73cf6270f364f11272df6b45c18596808880a68c78370736f59ee: Status 404 returned error can't find the container with id 6c46f93fa4d73cf6270f364f11272df6b45c18596808880a68c78370736f59ee Mar 10 00:06:27 crc kubenswrapper[4994]: E0310 00:06:27.086994 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="800ms" Mar 10 00:06:27 crc kubenswrapper[4994]: W0310 00:06:27.088032 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-28b6d77d1900211df3b731b1343563247180870acb2c2899bfb179c7793a50bd WatchSource:0}: Error finding container 28b6d77d1900211df3b731b1343563247180870acb2c2899bfb179c7793a50bd: Status 404 returned error can't find the container with id 28b6d77d1900211df3b731b1343563247180870acb2c2899bfb179c7793a50bd Mar 10 00:06:27 crc kubenswrapper[4994]: W0310 00:06:27.303000 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:27 crc kubenswrapper[4994]: E0310 00:06:27.303087 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.334048 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.335616 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.335652 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.335662 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.335686 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:27 crc kubenswrapper[4994]: E0310 00:06:27.336058 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.164:6443: connect: connection refused" node="crc" Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.481142 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:27 crc kubenswrapper[4994]: W0310 00:06:27.510051 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:27 crc kubenswrapper[4994]: 
E0310 00:06:27.510123 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.558513 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9ccfe79d4dfe6da753fc95167b67ebc134a794926f1a11929e3a732af141d1a4"} Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.559502 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"28b6d77d1900211df3b731b1343563247180870acb2c2899bfb179c7793a50bd"} Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.560574 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6c46f93fa4d73cf6270f364f11272df6b45c18596808880a68c78370736f59ee"} Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.561605 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c3df1cdfdff2ff0cff1b5236b34ebc262f2d8ab395878986e7ee06a83ed10c0c"} Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.562816 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"6858f9b978a8d205cbc7547b7638cc779b2799ed14a8bc8a7d70dff467b6f893"} Mar 10 00:06:27 crc 
kubenswrapper[4994]: W0310 00:06:27.700498 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:27 crc kubenswrapper[4994]: E0310 00:06:27.700637 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:27 crc kubenswrapper[4994]: W0310 00:06:27.803447 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:27 crc kubenswrapper[4994]: E0310 00:06:27.803562 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:27 crc kubenswrapper[4994]: E0310 00:06:27.888150 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="1.6s" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.136863 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.138645 4994 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.138715 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.138738 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.138776 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:28 crc kubenswrapper[4994]: E0310 00:06:28.139298 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.164:6443: connect: connection refused" node="crc" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.433287 4994 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 00:06:28 crc kubenswrapper[4994]: E0310 00:06:28.434949 4994 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.481219 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.568933 4994 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" 
containerID="7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a" exitCode=0 Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.569068 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a"} Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.569121 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.572070 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2c6d813390aa8385ab9838a01b3f678e39dde836a7084291095d4582ca467b83"} Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.572130 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.572140 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1"} Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.572164 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"597b2c3115f29618ea3e6b294f965e82d2c20cb7d2696bbaf686f37aacb920d6"} Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.572175 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.572195 4994 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.574769 4994 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80" exitCode=0 Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.574834 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80"} Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.574994 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.576099 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.576146 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.576166 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.576746 4994 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7" exitCode=0 Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.576832 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7"} Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.576988 4994 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.578040 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.578944 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.578967 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.578979 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.579105 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.579153 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.579178 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.581248 4994 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99" exitCode=0 Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.581306 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99"} Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.581396 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach"
Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.583379 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.583431 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.583449 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:29 crc kubenswrapper[4994]: W0310 00:06:29.243049 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused
Mar 10 00:06:29 crc kubenswrapper[4994]: E0310 00:06:29.243375 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError"
Mar 10 00:06:29 crc kubenswrapper[4994]: W0310 00:06:29.391481 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused
Mar 10 00:06:29 crc kubenswrapper[4994]: E0310 00:06:29.391557 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.480767 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused
Mar 10 00:06:29 crc kubenswrapper[4994]: E0310 00:06:29.489514 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="3.2s"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.587214 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c7cdb503feaf0be642fb32d4cac4a4ab28552f16b83033f22c9eacd90a623ce3"}
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.587272 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4c0b61110f8d95cb59bdf543779a8e2787b6b9b8f10548fa643d5bfb41f24d70"}
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.587291 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"045232a99c43b7a39857ce2e6902de25546425a59bb0f9c7b076e6f0d9629f56"}
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.587326 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.588435 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.588472 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.588485 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.592491 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5f52835a4ae8c3bb9dfd82d6d82a25994f8f5182fe9de997d39c5dd4561260f2"}
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.592606 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.593573 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.593600 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.593612 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.596425 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04"}
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.596458 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c"}
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.596477 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5"}
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.596505 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b"}
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.598241 4994 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba" exitCode=0
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.598290 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba"}
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.598420 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.599420 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.599453 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.599464 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.600478 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573"}
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.600566 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.601382 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.601425 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.601439 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.739615 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.741057 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.741100 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.741110 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.741136 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 00:06:29 crc kubenswrapper[4994]: E0310 00:06:29.741603 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.164:6443: connect: connection refused" node="crc"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.799164 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.809434 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.607060 4994 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010" exitCode=0
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.607154 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010"}
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.607240 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.608814 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.608863 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.608909 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.612674 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a172cab5caa680bc0a998d64ce1c15a99e8b1d8705c147b732f9861c19ad8d13"}
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.612753 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.612808 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.612924 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.612976 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.613113 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.614353 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.614439 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.614460 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.614481 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.614525 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.614546 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.614861 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.614902 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.614916 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.615044 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.615083 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.615101 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.621777 4994 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.621827 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.621833 4994 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.621843 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7"}
Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.621927 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.621937 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.621955 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176"}
Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.621984 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637"}
Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.623132 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.623198 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.623217 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.623583 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.623633 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.623651 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.623748 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.623774 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.623785 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.110524 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.509736 4994 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.630273 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9"}
Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.630343 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04"}
Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.630373 4994 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.630406 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.630443 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.632105 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.632161 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.632179 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.632400 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.632468 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.632492 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.821507 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.942005 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.944468 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.944525 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.944542 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.944579 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.633000 4994 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.633069 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.633015 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.634829 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.634865 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.634900 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.634971 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.634930 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.635008 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.785036 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.785323 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.786973 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.787030 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.787052 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.837275 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 00:06:34 crc kubenswrapper[4994]: I0310 00:06:34.635686 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:34 crc kubenswrapper[4994]: I0310 00:06:34.636847 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:34 crc kubenswrapper[4994]: I0310 00:06:34.636957 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:34 crc kubenswrapper[4994]: I0310 00:06:34.636984 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:35 crc kubenswrapper[4994]: I0310 00:06:35.389939 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 00:06:35 crc kubenswrapper[4994]: I0310 00:06:35.390200 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:35 crc kubenswrapper[4994]: I0310 00:06:35.391900 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:35 crc kubenswrapper[4994]: I0310 00:06:35.391935 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:35 crc kubenswrapper[4994]: I0310 00:06:35.391949 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:35 crc kubenswrapper[4994]: I0310 00:06:35.925810 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Mar 10 00:06:35 crc kubenswrapper[4994]: I0310 00:06:35.926107 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:35 crc kubenswrapper[4994]: I0310 00:06:35.929689 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:35 crc kubenswrapper[4994]: I0310 00:06:35.929756 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:35 crc kubenswrapper[4994]: I0310 00:06:35.929782 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:36 crc kubenswrapper[4994]: I0310 00:06:36.292915 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 00:06:36 crc kubenswrapper[4994]: I0310 00:06:36.293137 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:36 crc kubenswrapper[4994]: I0310 00:06:36.294812 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:36 crc kubenswrapper[4994]: I0310 00:06:36.294867 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:36 crc kubenswrapper[4994]: I0310 00:06:36.294923 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:36 crc kubenswrapper[4994]: E0310 00:06:36.640851 4994 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 00:06:38 crc kubenswrapper[4994]: I0310 00:06:38.389863 4994 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 10 00:06:38 crc kubenswrapper[4994]: I0310 00:06:38.390005 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 10 00:06:40 crc kubenswrapper[4994]: W0310 00:06:40.242716 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 10 00:06:40 crc kubenswrapper[4994]: I0310 00:06:40.242925 4994 trace.go:236] Trace[475414176]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Mar-2026 00:06:30.241) (total time: 10001ms):
Mar 10 00:06:40 crc kubenswrapper[4994]: Trace[475414176]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:06:40.242)
Mar 10 00:06:40 crc kubenswrapper[4994]: Trace[475414176]: [10.001821248s] [10.001821248s] END
Mar 10 00:06:40 crc kubenswrapper[4994]: E0310 00:06:40.242970 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 10 00:06:40 crc kubenswrapper[4994]: W0310 00:06:40.327276 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 10 00:06:40 crc kubenswrapper[4994]: I0310 00:06:40.327374 4994 trace.go:236] Trace[984513433]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Mar-2026 00:06:30.326) (total time: 10001ms):
Mar 10 00:06:40 crc kubenswrapper[4994]: Trace[984513433]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:06:40.327)
Mar 10 00:06:40 crc kubenswrapper[4994]: Trace[984513433]: [10.001139388s] [10.001139388s] END
Mar 10 00:06:40 crc kubenswrapper[4994]: E0310 00:06:40.327397 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 10 00:06:40 crc kubenswrapper[4994]: I0310 00:06:40.481612 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Mar 10 00:06:40 crc kubenswrapper[4994]: E0310 00:06:40.826542 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:40Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Mar 10 00:06:40 crc kubenswrapper[4994]: W0310 00:06:40.827463 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:40Z is after 2026-02-23T05:33:13Z
Mar 10 00:06:40 crc kubenswrapper[4994]: E0310 00:06:40.827528 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 10 00:06:40 crc kubenswrapper[4994]: E0310 00:06:40.831459 4994 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:40Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b522282d4604f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.477744207 +0000 UTC m=+0.651450996,LastTimestamp:2026-03-10 00:06:26.477744207 +0000 UTC m=+0.651450996,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 00:06:40 crc kubenswrapper[4994]: W0310 00:06:40.835311 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:40Z is after 2026-02-23T05:33:13Z
Mar 10 00:06:40 crc kubenswrapper[4994]: E0310 00:06:40.835404 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 10 00:06:40 crc kubenswrapper[4994]: E0310 00:06:40.838095 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:40Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 10 00:06:40 crc kubenswrapper[4994]: I0310 00:06:40.843157 4994 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 10 00:06:40 crc kubenswrapper[4994]: I0310 00:06:40.843251 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 10 00:06:40 crc kubenswrapper[4994]: I0310 00:06:40.848830 4994 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 10 00:06:40 crc kubenswrapper[4994]: I0310 00:06:40.848910 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 10 00:06:40 crc kubenswrapper[4994]: E0310 00:06:40.858836 4994 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 10 00:06:41 crc kubenswrapper[4994]: I0310 00:06:41.485355 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:41Z is after 2026-02-23T05:33:13Z
Mar 10 00:06:41 crc kubenswrapper[4994]: I0310 00:06:41.658016 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 10 00:06:41 crc kubenswrapper[4994]: I0310 00:06:41.660782 4994 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a172cab5caa680bc0a998d64ce1c15a99e8b1d8705c147b732f9861c19ad8d13" exitCode=255
Mar 10 00:06:41 crc kubenswrapper[4994]: I0310 00:06:41.660850 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a172cab5caa680bc0a998d64ce1c15a99e8b1d8705c147b732f9861c19ad8d13"}
Mar 10 00:06:41 crc kubenswrapper[4994]: I0310 00:06:41.661111 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:41 crc kubenswrapper[4994]: I0310 00:06:41.662321 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:41 crc kubenswrapper[4994]: I0310 00:06:41.662365 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:41 crc kubenswrapper[4994]: I0310 00:06:41.662384 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:41 crc kubenswrapper[4994]: I0310 00:06:41.663188 4994 scope.go:117] "RemoveContainer" containerID="a172cab5caa680bc0a998d64ce1c15a99e8b1d8705c147b732f9861c19ad8d13"
Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.180328 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.180601 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.181744 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.181781 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.181823 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.237211 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.486462 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:42Z is after 2026-02-23T05:33:13Z
Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.666566 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.668691 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.669336 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f52a5461d5c275703eae1c974fba29c6d8fd43dca97c8db12da3652511c2a086"}
Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.669440 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.670258 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.670284 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.670295 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.670988 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasSufficientMemory" Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.671010 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.671019 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.684948 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.829916 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.486567 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:43Z is after 2026-02-23T05:33:13Z Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.678069 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.678736 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.681597 4994 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f52a5461d5c275703eae1c974fba29c6d8fd43dca97c8db12da3652511c2a086" exitCode=255 Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.681851 4994 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.682067 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f52a5461d5c275703eae1c974fba29c6d8fd43dca97c8db12da3652511c2a086"} Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.682195 4994 scope.go:117] "RemoveContainer" containerID="a172cab5caa680bc0a998d64ce1c15a99e8b1d8705c147b732f9861c19ad8d13" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.682282 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.683614 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.683770 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.683806 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.683775 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.683964 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.683994 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.685112 4994 scope.go:117] "RemoveContainer" containerID="f52a5461d5c275703eae1c974fba29c6d8fd43dca97c8db12da3652511c2a086" Mar 10 00:06:43 crc kubenswrapper[4994]: E0310 00:06:43.685432 4994 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.687017 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.791502 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.791706 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.793238 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.793299 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.793320 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.838159 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:44 crc kubenswrapper[4994]: I0310 00:06:44.486069 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:44Z is 
after 2026-02-23T05:33:13Z Mar 10 00:06:44 crc kubenswrapper[4994]: I0310 00:06:44.689555 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 00:06:44 crc kubenswrapper[4994]: I0310 00:06:44.693936 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:44 crc kubenswrapper[4994]: I0310 00:06:44.695022 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:44 crc kubenswrapper[4994]: I0310 00:06:44.695069 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:44 crc kubenswrapper[4994]: I0310 00:06:44.695086 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:44 crc kubenswrapper[4994]: I0310 00:06:44.695802 4994 scope.go:117] "RemoveContainer" containerID="f52a5461d5c275703eae1c974fba29c6d8fd43dca97c8db12da3652511c2a086" Mar 10 00:06:44 crc kubenswrapper[4994]: E0310 00:06:44.696104 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:06:45 crc kubenswrapper[4994]: W0310 00:06:45.009068 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:45Z is after 
2026-02-23T05:33:13Z Mar 10 00:06:45 crc kubenswrapper[4994]: E0310 00:06:45.009187 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:45Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:45 crc kubenswrapper[4994]: I0310 00:06:45.486317 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:45Z is after 2026-02-23T05:33:13Z Mar 10 00:06:45 crc kubenswrapper[4994]: W0310 00:06:45.671800 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:45Z is after 2026-02-23T05:33:13Z Mar 10 00:06:45 crc kubenswrapper[4994]: E0310 00:06:45.671925 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:45Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:45 crc kubenswrapper[4994]: I0310 00:06:45.697182 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:45 
crc kubenswrapper[4994]: I0310 00:06:45.698598 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:45 crc kubenswrapper[4994]: I0310 00:06:45.698650 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:45 crc kubenswrapper[4994]: I0310 00:06:45.698669 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:45 crc kubenswrapper[4994]: I0310 00:06:45.699576 4994 scope.go:117] "RemoveContainer" containerID="f52a5461d5c275703eae1c974fba29c6d8fd43dca97c8db12da3652511c2a086" Mar 10 00:06:45 crc kubenswrapper[4994]: E0310 00:06:45.699864 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:06:46 crc kubenswrapper[4994]: I0310 00:06:46.484639 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:46Z is after 2026-02-23T05:33:13Z Mar 10 00:06:46 crc kubenswrapper[4994]: E0310 00:06:46.641147 4994 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 00:06:47 crc kubenswrapper[4994]: E0310 00:06:47.232471 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:47Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 10 00:06:47 crc kubenswrapper[4994]: I0310 00:06:47.238607 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:47 crc kubenswrapper[4994]: I0310 00:06:47.240447 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:47 crc kubenswrapper[4994]: I0310 00:06:47.240528 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:47 crc kubenswrapper[4994]: I0310 00:06:47.240548 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:47 crc kubenswrapper[4994]: I0310 00:06:47.240581 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:47 crc kubenswrapper[4994]: E0310 00:06:47.245613 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:47Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 00:06:47 crc kubenswrapper[4994]: I0310 00:06:47.484674 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:47Z is after 2026-02-23T05:33:13Z Mar 10 00:06:48 crc kubenswrapper[4994]: I0310 00:06:48.391114 4994 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 00:06:48 crc kubenswrapper[4994]: I0310 00:06:48.391770 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 00:06:48 crc kubenswrapper[4994]: I0310 00:06:48.486077 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:48Z is after 2026-02-23T05:33:13Z Mar 10 00:06:48 crc kubenswrapper[4994]: I0310 00:06:48.863409 4994 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 00:06:48 crc kubenswrapper[4994]: E0310 00:06:48.869386 4994 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:48Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:49 crc kubenswrapper[4994]: I0310 00:06:49.486547 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:49Z is after 2026-02-23T05:33:13Z Mar 10 00:06:50 crc kubenswrapper[4994]: I0310 00:06:50.485433 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:50Z is after 2026-02-23T05:33:13Z Mar 10 00:06:50 crc kubenswrapper[4994]: E0310 00:06:50.837070 4994 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:50Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b522282d4604f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.477744207 +0000 UTC m=+0.651450996,LastTimestamp:2026-03-10 00:06:26.477744207 +0000 UTC m=+0.651450996,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:51 crc kubenswrapper[4994]: W0310 00:06:51.198981 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:51Z is after 2026-02-23T05:33:13Z Mar 10 00:06:51 crc kubenswrapper[4994]: E0310 00:06:51.199082 4994 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:51Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:51 crc kubenswrapper[4994]: I0310 00:06:51.485353 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:51Z is after 2026-02-23T05:33:13Z Mar 10 00:06:51 crc kubenswrapper[4994]: W0310 00:06:51.814407 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:51Z is after 2026-02-23T05:33:13Z Mar 10 00:06:51 crc kubenswrapper[4994]: E0310 00:06:51.814501 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:51Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:52 crc kubenswrapper[4994]: I0310 00:06:52.486142 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:52Z is after 2026-02-23T05:33:13Z Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.271076 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.271361 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.273012 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.273155 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.273176 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.274013 4994 scope.go:117] "RemoveContainer" containerID="f52a5461d5c275703eae1c974fba29c6d8fd43dca97c8db12da3652511c2a086" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.485650 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:53Z is after 2026-02-23T05:33:13Z Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.722634 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.725541 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ecf22b989e157fbfad852193a09a519c3d53cc5918f2a245644553e08338d118"} Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.725713 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.726443 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.726497 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.726515 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.837575 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:54 crc kubenswrapper[4994]: E0310 00:06:54.237588 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:54Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.245756 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.247150 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.247212 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 
00:06:54.247235 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.247276 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:54 crc kubenswrapper[4994]: E0310 00:06:54.252240 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:54Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 00:06:54 crc kubenswrapper[4994]: W0310 00:06:54.456152 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:54Z is after 2026-02-23T05:33:13Z Mar 10 00:06:54 crc kubenswrapper[4994]: E0310 00:06:54.456245 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:54Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.486308 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:54Z is after 2026-02-23T05:33:13Z Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.730132 4994 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.731070 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.733830 4994 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ecf22b989e157fbfad852193a09a519c3d53cc5918f2a245644553e08338d118" exitCode=255 Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.733925 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ecf22b989e157fbfad852193a09a519c3d53cc5918f2a245644553e08338d118"} Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.733998 4994 scope.go:117] "RemoveContainer" containerID="f52a5461d5c275703eae1c974fba29c6d8fd43dca97c8db12da3652511c2a086" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.734110 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.735523 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.735571 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.735588 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.736657 4994 scope.go:117] "RemoveContainer" 
containerID="ecf22b989e157fbfad852193a09a519c3d53cc5918f2a245644553e08338d118" Mar 10 00:06:54 crc kubenswrapper[4994]: E0310 00:06:54.737009 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:06:55 crc kubenswrapper[4994]: I0310 00:06:55.485755 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:55Z is after 2026-02-23T05:33:13Z Mar 10 00:06:55 crc kubenswrapper[4994]: I0310 00:06:55.739577 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 00:06:55 crc kubenswrapper[4994]: I0310 00:06:55.742284 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:55 crc kubenswrapper[4994]: I0310 00:06:55.743356 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:55 crc kubenswrapper[4994]: I0310 00:06:55.743427 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:55 crc kubenswrapper[4994]: I0310 00:06:55.743487 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:55 crc kubenswrapper[4994]: I0310 00:06:55.744263 4994 scope.go:117] "RemoveContainer" 
containerID="ecf22b989e157fbfad852193a09a519c3d53cc5918f2a245644553e08338d118" Mar 10 00:06:55 crc kubenswrapper[4994]: E0310 00:06:55.744572 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:06:55 crc kubenswrapper[4994]: W0310 00:06:55.878337 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:55Z is after 2026-02-23T05:33:13Z Mar 10 00:06:55 crc kubenswrapper[4994]: E0310 00:06:55.878442 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:56 crc kubenswrapper[4994]: I0310 00:06:56.485542 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:56Z is after 2026-02-23T05:33:13Z Mar 10 00:06:56 crc kubenswrapper[4994]: E0310 00:06:56.641397 4994 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed 
to get node info: node \"crc\" not found" Mar 10 00:06:57 crc kubenswrapper[4994]: I0310 00:06:57.485616 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:57Z is after 2026-02-23T05:33:13Z Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.391233 4994 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.391324 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.391404 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.391607 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.393037 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.393096 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.393123 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.393836 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.394120 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1" gracePeriod=30 Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.484814 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:58Z is after 2026-02-23T05:33:13Z Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.753433 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.754585 4994 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1" exitCode=255 Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 
00:06:58.754801 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1"} Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.755025 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"558ec9f0b766ac2f80c3d721a99be111e4f7af9e09bd7880b4197525bc7d7406"} Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.755277 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.756535 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.756585 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.756603 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:59 crc kubenswrapper[4994]: I0310 00:06:59.487700 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:59 crc kubenswrapper[4994]: I0310 00:06:59.758054 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:59 crc kubenswrapper[4994]: I0310 00:06:59.759369 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:59 crc kubenswrapper[4994]: I0310 
00:06:59.759428 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:59 crc kubenswrapper[4994]: I0310 00:06:59.759447 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:00 crc kubenswrapper[4994]: I0310 00:07:00.489099 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.846924 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b522282d4604f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.477744207 +0000 UTC m=+0.651450996,LastTimestamp:2026-03-10 00:06:26.477744207 +0000 UTC m=+0.651450996,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.854538 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228695e9c0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540759488 +0000 UTC m=+0.714466267,LastTimestamp:2026-03-10 00:06:26.540759488 +0000 UTC m=+0.714466267,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.860857 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b522286966f97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540793751 +0000 UTC m=+0.714500530,LastTimestamp:2026-03-10 00:06:26.540793751 +0000 UTC m=+0.714500530,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.867966 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228696b29c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540810908 +0000 UTC m=+0.714517697,LastTimestamp:2026-03-10 00:06:26.540810908 +0000 UTC 
m=+0.714517697,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.875713 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228c094ff4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.632208372 +0000 UTC m=+0.805915161,LastTimestamp:2026-03-10 00:06:26.632208372 +0000 UTC m=+0.805915161,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.883910 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228695e9c0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228695e9c0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540759488 +0000 UTC m=+0.714466267,LastTimestamp:2026-03-10 00:06:26.654443816 +0000 UTC m=+0.828150585,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc 
kubenswrapper[4994]: E0310 00:07:00.887490 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b522286966f97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b522286966f97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540793751 +0000 UTC m=+0.714500530,LastTimestamp:2026-03-10 00:06:26.654464804 +0000 UTC m=+0.828171563,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.891251 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228696b29c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228696b29c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540810908 +0000 UTC m=+0.714517697,LastTimestamp:2026-03-10 00:06:26.654476809 +0000 UTC m=+0.828183568,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.893930 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228695e9c0\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228695e9c0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540759488 +0000 UTC m=+0.714466267,LastTimestamp:2026-03-10 00:06:26.658586995 +0000 UTC m=+0.832293784,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.897992 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b522286966f97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b522286966f97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540793751 +0000 UTC m=+0.714500530,LastTimestamp:2026-03-10 00:06:26.658682372 +0000 UTC m=+0.832389161,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.903482 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228696b29c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228696b29c default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540810908 +0000 UTC m=+0.714517697,LastTimestamp:2026-03-10 00:06:26.658701019 +0000 UTC m=+0.832407808,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.910150 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228695e9c0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228695e9c0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540759488 +0000 UTC m=+0.714466267,LastTimestamp:2026-03-10 00:06:26.658915849 +0000 UTC m=+0.832622608,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.916864 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b522286966f97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b522286966f97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540793751 +0000 UTC m=+0.714500530,LastTimestamp:2026-03-10 00:06:26.658935186 +0000 UTC m=+0.832641945,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.923561 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228696b29c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228696b29c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540810908 +0000 UTC m=+0.714517697,LastTimestamp:2026-03-10 00:06:26.65894662 +0000 UTC m=+0.832653379,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.930389 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228695e9c0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228695e9c0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540759488 +0000 UTC m=+0.714466267,LastTimestamp:2026-03-10 00:06:26.661032109 +0000 UTC m=+0.834738898,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.937402 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b522286966f97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b522286966f97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540793751 +0000 UTC m=+0.714500530,LastTimestamp:2026-03-10 00:06:26.661053297 +0000 UTC m=+0.834760086,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.943926 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228696b29c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228696b29c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540810908 +0000 UTC 
m=+0.714517697,LastTimestamp:2026-03-10 00:06:26.661070794 +0000 UTC m=+0.834777583,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.950417 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228695e9c0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228695e9c0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540759488 +0000 UTC m=+0.714466267,LastTimestamp:2026-03-10 00:06:26.661201743 +0000 UTC m=+0.834908512,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.957765 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b522286966f97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b522286966f97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540793751 +0000 UTC m=+0.714500530,LastTimestamp:2026-03-10 00:06:26.66122043 +0000 UTC m=+0.834927189,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.964756 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228696b29c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228696b29c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540810908 +0000 UTC m=+0.714517697,LastTimestamp:2026-03-10 00:06:26.661235215 +0000 UTC m=+0.834941974,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.971387 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228695e9c0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228695e9c0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540759488 +0000 UTC m=+0.714466267,LastTimestamp:2026-03-10 00:06:26.662481142 +0000 UTC m=+0.836187931,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.975635 4994 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b522286966f97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b522286966f97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540793751 +0000 UTC m=+0.714500530,LastTimestamp:2026-03-10 00:06:26.66250433 +0000 UTC m=+0.836211119,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.980716 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228696b29c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228696b29c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540810908 +0000 UTC m=+0.714517697,LastTimestamp:2026-03-10 00:06:26.662521377 +0000 UTC m=+0.836228166,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.987672 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228695e9c0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228695e9c0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540759488 +0000 UTC m=+0.714466267,LastTimestamp:2026-03-10 00:06:26.662678796 +0000 UTC m=+0.836385595,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.989264 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b522286966f97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b522286966f97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540793751 +0000 UTC m=+0.714500530,LastTimestamp:2026-03-10 00:06:26.662910512 +0000 UTC m=+0.836617291,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.994866 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b5222a603760a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.068032522 +0000 UTC m=+1.241739311,LastTimestamp:2026-03-10 00:06:27.068032522 +0000 UTC m=+1.241739311,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.000659 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b5222a603d8ab openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.068057771 +0000 UTC m=+1.241764560,LastTimestamp:2026-03-10 00:06:27.068057771 +0000 UTC m=+1.241764560,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: 
E0310 00:07:01.006553 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5222a64045bc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.072017852 +0000 UTC m=+1.245724641,LastTimestamp:2026-03-10 00:06:27.072017852 +0000 UTC m=+1.245724641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.012060 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b5222a70de7d7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.085494231 +0000 UTC 
m=+1.259201020,LastTimestamp:2026-03-10 00:06:27.085494231 +0000 UTC m=+1.259201020,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.015950 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222a7dfcb60 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.099249504 +0000 UTC m=+1.272956283,LastTimestamp:2026-03-10 00:06:27.099249504 +0000 UTC m=+1.272956283,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.021742 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222cc3c24aa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.70928145 +0000 UTC m=+1.882988209,LastTimestamp:2026-03-10 00:06:27.70928145 +0000 UTC m=+1.882988209,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.027583 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b5222cc67177a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.712096122 +0000 UTC m=+1.885802912,LastTimestamp:2026-03-10 00:06:27.712096122 +0000 UTC m=+1.885802912,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.033209 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b5222cc6e0379 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.712549753 +0000 UTC m=+1.886256512,LastTimestamp:2026-03-10 00:06:27.712549753 +0000 UTC m=+1.886256512,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.038995 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b5222ccbbcd0c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.717647628 +0000 UTC m=+1.891354387,LastTimestamp:2026-03-10 00:06:27.717647628 +0000 UTC m=+1.891354387,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.042945 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222ccea8601 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.720709633 +0000 UTC m=+1.894416422,LastTimestamp:2026-03-10 00:06:27.720709633 +0000 UTC m=+1.894416422,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.049711 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222cd081626 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.722647078 +0000 UTC m=+1.896353837,LastTimestamp:2026-03-10 00:06:27.722647078 +0000 UTC m=+1.896353837,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.056412 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5222cd30e8d0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.725322448 +0000 UTC m=+1.899029207,LastTimestamp:2026-03-10 00:06:27.725322448 +0000 UTC m=+1.899029207,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.062845 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b5222cd4b2d2a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.727043882 +0000 UTC m=+1.900750671,LastTimestamp:2026-03-10 00:06:27.727043882 +0000 UTC m=+1.900750671,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.069773 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b5222cddeabe4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.736710116 +0000 UTC m=+1.910416875,LastTimestamp:2026-03-10 00:06:27.736710116 +0000 UTC m=+1.910416875,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.079771 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5222cdea73bf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.737482175 +0000 UTC m=+1.911188934,LastTimestamp:2026-03-10 00:06:27.737482175 +0000 UTC m=+1.911188934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.086203 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b5222ce00bdb5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.738942901 +0000 UTC m=+1.912649680,LastTimestamp:2026-03-10 00:06:27.738942901 +0000 UTC m=+1.912649680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.093380 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222e15a96d7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.063598295 +0000 UTC m=+2.237305074,LastTimestamp:2026-03-10 00:06:28.063598295 
+0000 UTC m=+2.237305074,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.097316 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222e2468471 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.079060081 +0000 UTC m=+2.252766860,LastTimestamp:2026-03-10 00:06:28.079060081 +0000 UTC m=+2.252766860,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.102751 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222e25d37a4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.080547748 +0000 UTC m=+2.254254537,LastTimestamp:2026-03-10 00:06:28.080547748 +0000 UTC m=+2.254254537,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.110197 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222f1876177 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.334969207 +0000 UTC m=+2.508675986,LastTimestamp:2026-03-10 00:06:28.334969207 +0000 UTC m=+2.508675986,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.116697 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222f24ff63a openshift-kube-controller-manager 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.34811449 +0000 UTC m=+2.521821279,LastTimestamp:2026-03-10 00:06:28.34811449 +0000 UTC m=+2.521821279,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.123035 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222f267e56a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.34968305 +0000 UTC m=+2.523389839,LastTimestamp:2026-03-10 00:06:28.34968305 +0000 UTC m=+2.523389839,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 
00:07:01.130727 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b5222ffc90f4c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.574154572 +0000 UTC m=+2.747861351,LastTimestamp:2026-03-10 00:06:28.574154572 +0000 UTC m=+2.747861351,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.139464 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b522300021f94 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.577894292 +0000 UTC m=+2.751601051,LastTimestamp:2026-03-10 00:06:28.577894292 +0000 UTC m=+2.751601051,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.148300 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223003d0b7b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.581755771 +0000 UTC m=+2.755462560,LastTimestamp:2026-03-10 00:06:28.581755771 +0000 UTC m=+2.755462560,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.156719 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b5223008e75ea openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.587091434 +0000 UTC m=+2.760798223,LastTimestamp:2026-03-10 00:06:28.587091434 +0000 UTC m=+2.760798223,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.163181 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b52230214c216 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.612669974 +0000 UTC m=+2.786376763,LastTimestamp:2026-03-10 00:06:28.612669974 +0000 UTC m=+2.786376763,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.170107 4994 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b522303db9a7e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.642478718 +0000 UTC m=+2.816185517,LastTimestamp:2026-03-10 00:06:28.642478718 +0000 UTC m=+2.816185517,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.176725 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b522310bf5509 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.858729737 +0000 UTC m=+3.032436496,LastTimestamp:2026-03-10 00:06:28.858729737 +0000 UTC m=+3.032436496,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.183120 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b522310dca24c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.86065006 +0000 UTC m=+3.034356819,LastTimestamp:2026-03-10 00:06:28.86065006 +0000 UTC m=+3.034356819,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.192015 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b522310dd5941 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.860696897 +0000 UTC m=+3.034403656,LastTimestamp:2026-03-10 00:06:28.860696897 +0000 UTC m=+3.034403656,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.200126 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b5223119d75cd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.873287117 +0000 UTC m=+3.046993876,LastTimestamp:2026-03-10 00:06:28.873287117 +0000 UTC m=+3.046993876,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.201542 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b522311bd3374 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.875367284 +0000 UTC m=+3.049074043,LastTimestamp:2026-03-10 00:06:28.875367284 +0000 UTC m=+3.049074043,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.207810 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b522311cab733 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.876252979 +0000 UTC m=+3.049959738,LastTimestamp:2026-03-10 00:06:28.876252979 +0000 UTC m=+3.049959738,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.214224 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b522311d61298 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.876997272 +0000 UTC m=+3.050704031,LastTimestamp:2026-03-10 00:06:28.876997272 +0000 UTC m=+3.050704031,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.218333 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b522311d94bf4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.877208564 +0000 UTC m=+3.050915323,LastTimestamp:2026-03-10 00:06:28.877208564 +0000 UTC m=+3.050915323,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.224661 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b522311e25e56 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.877803094 +0000 UTC m=+3.051509853,LastTimestamp:2026-03-10 00:06:28.877803094 +0000 UTC m=+3.051509853,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.230538 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b522313030083 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.896718979 +0000 UTC m=+3.070425758,LastTimestamp:2026-03-10 00:06:28.896718979 +0000 UTC m=+3.070425758,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.237077 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b52231fb28882 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.109549186 +0000 UTC m=+3.283255945,LastTimestamp:2026-03-10 00:06:29.109549186 +0000 UTC m=+3.283255945,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.246972 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52231fb617ef openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.109782511 +0000 UTC m=+3.283489260,LastTimestamp:2026-03-10 00:06:29.109782511 +0000 UTC m=+3.283489260,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 
00:07:01.247602 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 00:07:01 crc kubenswrapper[4994]: I0310 00:07:01.252786 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.253071 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b522320903add openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.124078301 +0000 UTC m=+3.297785050,LastTimestamp:2026-03-10 00:06:29.124078301 +0000 UTC m=+3.297785050,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: I0310 00:07:01.255366 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:01 crc kubenswrapper[4994]: I0310 00:07:01.255424 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:01 crc kubenswrapper[4994]: I0310 00:07:01.255442 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 10 00:07:01 crc kubenswrapper[4994]: I0310 00:07:01.255478 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.259766 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.259916 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b5223209f18ae openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.12505259 +0000 UTC m=+3.298759339,LastTimestamp:2026-03-10 00:06:29.12505259 +0000 UTC m=+3.298759339,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.262989 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b522320dd5485 openshift-kube-scheduler 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.129131141 +0000 UTC m=+3.302837900,LastTimestamp:2026-03-10 00:06:29.129131141 +0000 UTC m=+3.302837900,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.266251 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b522320fe4fff openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.131292671 +0000 UTC m=+3.304999420,LastTimestamp:2026-03-10 00:06:29.131292671 +0000 UTC m=+3.304999420,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.270494 4994 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52232d23d6fa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.33507865 +0000 UTC m=+3.508785409,LastTimestamp:2026-03-10 00:06:29.33507865 +0000 UTC m=+3.508785409,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.272825 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b52232d4b61e3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.337670115 +0000 UTC m=+3.511376874,LastTimestamp:2026-03-10 00:06:29.337670115 +0000 UTC m=+3.511376874,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.279094 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52232e8d0c80 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.358750848 +0000 UTC m=+3.532457607,LastTimestamp:2026-03-10 00:06:29.358750848 +0000 UTC m=+3.532457607,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.285851 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52232e9fe449 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.359985737 +0000 UTC m=+3.533692526,LastTimestamp:2026-03-10 00:06:29.359985737 +0000 UTC m=+3.533692526,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.292480 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b52232ebd1038 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.361897528 +0000 UTC m=+3.535604287,LastTimestamp:2026-03-10 00:06:29.361897528 +0000 UTC m=+3.535604287,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.299120 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52233a447c01 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.555321857 +0000 UTC m=+3.729028616,LastTimestamp:2026-03-10 00:06:29.555321857 +0000 UTC m=+3.729028616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.306945 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52233b2da79b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.570602907 +0000 UTC m=+3.744309676,LastTimestamp:2026-03-10 00:06:29.570602907 +0000 UTC m=+3.744309676,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.312346 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189b52233b432679 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.572011641 +0000 UTC m=+3.745718430,LastTimestamp:2026-03-10 00:06:29.572011641 +0000 UTC m=+3.745718430,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.320483 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b52233cf916d6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.600712406 +0000 UTC m=+3.774419165,LastTimestamp:2026-03-10 00:06:29.600712406 +0000 UTC m=+3.774419165,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.327266 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52234826974b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.788243787 +0000 UTC m=+3.961950536,LastTimestamp:2026-03-10 00:06:29.788243787 +0000 UTC m=+3.961950536,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.334079 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b522348fe1ae8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.80236772 +0000 UTC m=+3.976074469,LastTimestamp:2026-03-10 
00:06:29.80236772 +0000 UTC m=+3.976074469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.340573 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b52234a5c7918 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.825329432 +0000 UTC m=+3.999036201,LastTimestamp:2026-03-10 00:06:29.825329432 +0000 UTC m=+3.999036201,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.346759 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b52234b121bb4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.837233076 +0000 UTC m=+4.010939845,LastTimestamp:2026-03-10 00:06:29.837233076 +0000 UTC 
m=+4.010939845,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.353436 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223792fbca8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:30.61092676 +0000 UTC m=+4.784633539,LastTimestamp:2026-03-10 00:06:30.61092676 +0000 UTC m=+4.784633539,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.361571 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b522389f6a077 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:30.892396663 +0000 UTC 
m=+5.066103442,LastTimestamp:2026-03-10 00:06:30.892396663 +0000 UTC m=+5.066103442,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.367799 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b52238ab6e4b3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:30.904997043 +0000 UTC m=+5.078703832,LastTimestamp:2026-03-10 00:06:30.904997043 +0000 UTC m=+5.078703832,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.373578 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b52238accbb20 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:30.906428192 +0000 UTC m=+5.080134971,LastTimestamp:2026-03-10 00:06:30.906428192 +0000 UTC m=+5.080134971,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.378386 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223998e4ff5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:31.153995765 +0000 UTC m=+5.327702544,LastTimestamp:2026-03-10 00:06:31.153995765 +0000 UTC m=+5.327702544,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.384709 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b52239a7b3519 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:31.169520921 +0000 UTC 
m=+5.343227710,LastTimestamp:2026-03-10 00:06:31.169520921 +0000 UTC m=+5.343227710,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.391335 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b52239a9691ff openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:31.171314175 +0000 UTC m=+5.345020954,LastTimestamp:2026-03-10 00:06:31.171314175 +0000 UTC m=+5.345020954,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.397315 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223aa5669d8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container 
etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:31.435545048 +0000 UTC m=+5.609251827,LastTimestamp:2026-03-10 00:06:31.435545048 +0000 UTC m=+5.609251827,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.404186 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223ab4d1e63 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:31.451713123 +0000 UTC m=+5.625419912,LastTimestamp:2026-03-10 00:06:31.451713123 +0000 UTC m=+5.625419912,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.410344 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223ab61328d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:31.453029005 +0000 UTC m=+5.626735784,LastTimestamp:2026-03-10 00:06:31.453029005 +0000 UTC m=+5.626735784,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.417707 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223bb98ae27 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:31.725100583 +0000 UTC m=+5.898807372,LastTimestamp:2026-03-10 00:06:31.725100583 +0000 UTC m=+5.898807372,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.423939 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223bc7d2dc0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:31.740075456 +0000 UTC m=+5.913782235,LastTimestamp:2026-03-10 00:06:31.740075456 +0000 UTC m=+5.913782235,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.430115 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223bc90306f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:31.741321327 +0000 UTC m=+5.915028116,LastTimestamp:2026-03-10 00:06:31.741321327 +0000 UTC m=+5.915028116,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.436523 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223cbc513ac openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:31.996445612 +0000 UTC m=+6.170152361,LastTimestamp:2026-03-10 00:06:31.996445612 +0000 UTC m=+6.170152361,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.443149 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223cc711d09 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:32.007720201 +0000 UTC m=+6.181426940,LastTimestamp:2026-03-10 00:06:32.007720201 +0000 UTC m=+6.181426940,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.453084 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 00:07:01 crc kubenswrapper[4994]: &Event{ObjectMeta:{kube-controller-manager-crc.189b522548da78f9 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 10 00:07:01 crc kubenswrapper[4994]: body: Mar 10 00:07:01 crc kubenswrapper[4994]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:38.389967097 +0000 UTC m=+12.563673886,LastTimestamp:2026-03-10 00:06:38.389967097 +0000 UTC m=+12.563673886,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 00:07:01 crc kubenswrapper[4994]: > Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.459749 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b522548dbd921 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:38.390057249 +0000 UTC m=+12.563764028,LastTimestamp:2026-03-10 00:06:38.390057249 +0000 UTC 
m=+12.563764028,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.466451 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 00:07:01 crc kubenswrapper[4994]: &Event{ObjectMeta:{kube-apiserver-crc.189b5225db1412d5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 10 00:07:01 crc kubenswrapper[4994]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 00:07:01 crc kubenswrapper[4994]: Mar 10 00:07:01 crc kubenswrapper[4994]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:40.843215573 +0000 UTC m=+15.016922332,LastTimestamp:2026-03-10 00:06:40.843215573 +0000 UTC m=+15.016922332,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 00:07:01 crc kubenswrapper[4994]: > Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.473065 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b5225db152a55 openshift-kube-apiserver 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:40.843287125 +0000 UTC m=+15.016993894,LastTimestamp:2026-03-10 00:06:40.843287125 +0000 UTC m=+15.016993894,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.479312 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b5225db1412d5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 00:07:01 crc kubenswrapper[4994]: &Event{ObjectMeta:{kube-apiserver-crc.189b5225db1412d5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 10 00:07:01 crc kubenswrapper[4994]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 00:07:01 crc kubenswrapper[4994]: Mar 10 00:07:01 crc kubenswrapper[4994]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:40.843215573 +0000 UTC m=+15.016922332,LastTimestamp:2026-03-10 00:06:40.848893498 +0000 UTC 
m=+15.022600257,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 00:07:01 crc kubenswrapper[4994]: > Mar 10 00:07:01 crc kubenswrapper[4994]: I0310 00:07:01.486182 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.486282 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b5225db152a55\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b5225db152a55 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:40.843287125 +0000 UTC m=+15.016993894,LastTimestamp:2026-03-10 00:06:40.84893639 +0000 UTC m=+15.022643149,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.489719 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b52233b432679\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52233b432679 openshift-kube-apiserver 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.572011641 +0000 UTC m=+3.745718430,LastTimestamp:2026-03-10 00:06:41.664645497 +0000 UTC m=+15.838352286,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.493859 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b52234826974b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52234826974b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.788243787 +0000 UTC m=+3.961950536,LastTimestamp:2026-03-10 00:06:41.960231794 +0000 UTC m=+16.133938563,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.495998 4994 event.go:359] "Server rejected 
event (will not retry!)" err="events \"kube-apiserver-crc.189b522348fe1ae8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b522348fe1ae8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.80236772 +0000 UTC m=+3.976074469,LastTimestamp:2026-03-10 00:06:41.972024858 +0000 UTC m=+16.145731617,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.500989 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 00:07:01 crc kubenswrapper[4994]: &Event{ObjectMeta:{kube-controller-manager-crc.189b52279d00e9f0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 00:07:01 crc kubenswrapper[4994]: body: Mar 10 00:07:01 crc kubenswrapper[4994]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:48.39170712 +0000 UTC m=+22.565413909,LastTimestamp:2026-03-10 00:06:48.39170712 +0000 UTC m=+22.565413909,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 00:07:01 crc kubenswrapper[4994]: > Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.504972 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b52279d02b035 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:48.391823413 +0000 UTC m=+22.565530202,LastTimestamp:2026-03-10 00:06:48.391823413 +0000 UTC m=+22.565530202,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.508662 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b52279d00e9f0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 00:07:01 crc kubenswrapper[4994]: 
&Event{ObjectMeta:{kube-controller-manager-crc.189b52279d00e9f0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 00:07:01 crc kubenswrapper[4994]: body: Mar 10 00:07:01 crc kubenswrapper[4994]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:48.39170712 +0000 UTC m=+22.565413909,LastTimestamp:2026-03-10 00:06:58.391302984 +0000 UTC m=+32.565009773,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 00:07:01 crc kubenswrapper[4994]: > Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.515137 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b52279d02b035\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b52279d02b035 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:48.391823413 +0000 UTC m=+22.565530202,LastTimestamp:2026-03-10 00:06:58.391362966 +0000 UTC m=+32.565069745,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.519411 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5229f1313a33 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:58.394094131 +0000 UTC m=+32.567800910,LastTimestamp:2026-03-10 00:06:58.394094131 +0000 UTC m=+32.567800910,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.521663 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b5222cd081626\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222cd081626 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.722647078 +0000 UTC m=+1.896353837,LastTimestamp:2026-03-10 00:06:58.516042376 +0000 UTC m=+32.689749125,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.527022 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b5222e15a96d7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222e15a96d7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.063598295 +0000 UTC m=+2.237305074,LastTimestamp:2026-03-10 00:06:58.688794086 +0000 UTC m=+32.862500835,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.532517 4994 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189b5222e2468471\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222e2468471 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.079060081 +0000 UTC m=+2.252766860,LastTimestamp:2026-03-10 00:06:58.700492656 +0000 UTC m=+32.874199405,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 00:07:02 crc kubenswrapper[4994]: I0310 00:07:02.483151 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:03 crc kubenswrapper[4994]: I0310 00:07:03.270146 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 00:07:03 crc kubenswrapper[4994]: I0310 00:07:03.270446 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:07:03 crc kubenswrapper[4994]: I0310 00:07:03.272197 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:03 crc kubenswrapper[4994]: I0310 00:07:03.272429 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:03 crc kubenswrapper[4994]: I0310 00:07:03.272572 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:03 crc kubenswrapper[4994]: I0310 00:07:03.273556 4994 scope.go:117] "RemoveContainer" containerID="ecf22b989e157fbfad852193a09a519c3d53cc5918f2a245644553e08338d118"
Mar 10 00:07:03 crc kubenswrapper[4994]: E0310 00:07:03.274028 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 00:07:03 crc kubenswrapper[4994]: I0310 00:07:03.487009 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:04 crc kubenswrapper[4994]: I0310 00:07:04.487723 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:05 crc kubenswrapper[4994]: I0310 00:07:05.390215 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 00:07:05 crc kubenswrapper[4994]: I0310 00:07:05.390491 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:07:05 crc kubenswrapper[4994]: I0310 00:07:05.392058 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:05 crc kubenswrapper[4994]: I0310 00:07:05.392112 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:05 crc kubenswrapper[4994]: I0310 00:07:05.392132 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:05 crc kubenswrapper[4994]: I0310 00:07:05.488438 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:05 crc kubenswrapper[4994]: I0310 00:07:05.600856 4994 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 10 00:07:05 crc kubenswrapper[4994]: I0310 00:07:05.621597 4994 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 10 00:07:06 crc kubenswrapper[4994]: I0310 00:07:06.293261 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 00:07:06 crc kubenswrapper[4994]: I0310 00:07:06.293525 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:07:06 crc kubenswrapper[4994]: I0310 00:07:06.295314 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:06 crc kubenswrapper[4994]: I0310 00:07:06.295354 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:06 crc kubenswrapper[4994]: I0310 00:07:06.295364 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:06 crc kubenswrapper[4994]: W0310 00:07:06.301395 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 10 00:07:06 crc kubenswrapper[4994]: E0310 00:07:06.301463 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 10 00:07:06 crc kubenswrapper[4994]: I0310 00:07:06.487806 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:06 crc kubenswrapper[4994]: E0310 00:07:06.641675 4994 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 00:07:07 crc kubenswrapper[4994]: I0310 00:07:07.488494 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:07 crc kubenswrapper[4994]: W0310 00:07:07.589442 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 10 00:07:07 crc kubenswrapper[4994]: E0310 00:07:07.589525 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 10 00:07:08 crc kubenswrapper[4994]: E0310 00:07:08.255682 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 10 00:07:08 crc kubenswrapper[4994]: I0310 00:07:08.260845 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:07:08 crc kubenswrapper[4994]: I0310 00:07:08.262664 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:08 crc kubenswrapper[4994]: I0310 00:07:08.262724 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:08 crc kubenswrapper[4994]: I0310 00:07:08.262744 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:08 crc kubenswrapper[4994]: I0310 00:07:08.262783 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 00:07:08 crc kubenswrapper[4994]: E0310 00:07:08.269147 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 10 00:07:08 crc kubenswrapper[4994]: I0310 00:07:08.391124 4994 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 10 00:07:08 crc kubenswrapper[4994]: I0310 00:07:08.391207 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 10 00:07:08 crc kubenswrapper[4994]: E0310 00:07:08.398096 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b52279d00e9f0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 10 00:07:08 crc kubenswrapper[4994]: &Event{ObjectMeta:{kube-controller-manager-crc.189b52279d00e9f0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 10 00:07:08 crc kubenswrapper[4994]: body:
Mar 10 00:07:08 crc kubenswrapper[4994]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:48.39170712 +0000 UTC m=+22.565413909,LastTimestamp:2026-03-10 00:07:08.391186487 +0000 UTC m=+42.564893276,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 10 00:07:08 crc kubenswrapper[4994]: >
Mar 10 00:07:08 crc kubenswrapper[4994]: E0310 00:07:08.405940 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b52279d02b035\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b52279d02b035 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:48.391823413 +0000 UTC m=+22.565530202,LastTimestamp:2026-03-10 00:07:08.391239678 +0000 UTC m=+42.564946457,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 00:07:08 crc kubenswrapper[4994]: I0310 00:07:08.489552 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:09 crc kubenswrapper[4994]: I0310 00:07:09.482739 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:09 crc kubenswrapper[4994]: W0310 00:07:09.798972 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 10 00:07:09 crc kubenswrapper[4994]: E0310 00:07:09.799053 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 10 00:07:10 crc kubenswrapper[4994]: I0310 00:07:10.487562 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:10 crc kubenswrapper[4994]: W0310 00:07:10.743495 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:10 crc kubenswrapper[4994]: E0310 00:07:10.743581 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 10 00:07:11 crc kubenswrapper[4994]: I0310 00:07:11.483704 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:12 crc kubenswrapper[4994]: I0310 00:07:12.483960 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:13 crc kubenswrapper[4994]: I0310 00:07:13.488082 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:14 crc kubenswrapper[4994]: I0310 00:07:14.487987 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:14 crc kubenswrapper[4994]: I0310 00:07:14.553767 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:07:14 crc kubenswrapper[4994]: I0310 00:07:14.556113 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:14 crc kubenswrapper[4994]: I0310 00:07:14.556187 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:14 crc kubenswrapper[4994]: I0310 00:07:14.556206 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:14 crc kubenswrapper[4994]: I0310 00:07:14.557125 4994 scope.go:117] "RemoveContainer" containerID="ecf22b989e157fbfad852193a09a519c3d53cc5918f2a245644553e08338d118"
Mar 10 00:07:15 crc kubenswrapper[4994]: E0310 00:07:15.262505 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.269592 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.270914 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.270940 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.270952 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.270979 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 00:07:15 crc kubenswrapper[4994]: E0310 00:07:15.277315 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.397553 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.397703 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.398693 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.398722 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.398731 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.401775 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.487222 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.804609 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.805416 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.808270 4994 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4" exitCode=255
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.808370 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.808343 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4"}
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.808446 4994 scope.go:117] "RemoveContainer" containerID="ecf22b989e157fbfad852193a09a519c3d53cc5918f2a245644553e08338d118"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.808675 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.809265 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.809297 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.809308 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.810283 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.810310 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.810319 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.810665 4994 scope.go:117] "RemoveContainer" containerID="4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4"
Mar 10 00:07:15 crc kubenswrapper[4994]: E0310 00:07:15.810800 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 00:07:16 crc kubenswrapper[4994]: I0310 00:07:16.486655 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:16 crc kubenswrapper[4994]: E0310 00:07:16.641791 4994 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 00:07:16 crc kubenswrapper[4994]: I0310 00:07:16.812975 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 10 00:07:17 crc kubenswrapper[4994]: I0310 00:07:17.183971 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 10 00:07:17 crc kubenswrapper[4994]: I0310 00:07:17.184256 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:07:17 crc kubenswrapper[4994]: I0310 00:07:17.185784 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:17 crc kubenswrapper[4994]: I0310 00:07:17.185852 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:17 crc kubenswrapper[4994]: I0310 00:07:17.185913 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:17 crc kubenswrapper[4994]: I0310 00:07:17.487625 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:18 crc kubenswrapper[4994]: I0310 00:07:18.487055 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:19 crc kubenswrapper[4994]: I0310 00:07:19.493192 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:20 crc kubenswrapper[4994]: I0310 00:07:20.487478 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:21 crc kubenswrapper[4994]: I0310 00:07:21.487755 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:22 crc kubenswrapper[4994]: E0310 00:07:22.270253 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 10 00:07:22 crc kubenswrapper[4994]: I0310 00:07:22.277625 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:07:22 crc kubenswrapper[4994]: I0310 00:07:22.279150 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:22 crc kubenswrapper[4994]: I0310 00:07:22.279218 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:22 crc kubenswrapper[4994]: I0310 00:07:22.279242 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:22 crc kubenswrapper[4994]: I0310 00:07:22.279283 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 00:07:22 crc kubenswrapper[4994]: E0310 00:07:22.286050 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 10 00:07:22 crc kubenswrapper[4994]: I0310 00:07:22.490776 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.271128 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.271628 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.273326 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.273542 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.273785 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.274916 4994 scope.go:117] "RemoveContainer" containerID="4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4"
Mar 10 00:07:23 crc kubenswrapper[4994]: E0310 00:07:23.275320 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.488083 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.837599 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.837817 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.839119 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.839184 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.839202 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.840095 4994 scope.go:117] "RemoveContainer" containerID="4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4"
Mar 10 00:07:23 crc kubenswrapper[4994]: E0310 00:07:23.840373 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 00:07:24 crc kubenswrapper[4994]: I0310 00:07:24.487418 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:25 crc kubenswrapper[4994]: I0310 00:07:25.487940 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:26 crc kubenswrapper[4994]: I0310 00:07:26.487329 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:26 crc kubenswrapper[4994]: E0310 00:07:26.642200 4994 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 00:07:27 crc kubenswrapper[4994]: I0310 00:07:27.487438 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:28 crc kubenswrapper[4994]: I0310 00:07:28.489182 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:29 crc kubenswrapper[4994]: E0310 00:07:29.277357 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 10 00:07:29 crc kubenswrapper[4994]: I0310 00:07:29.286535 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:07:29 crc kubenswrapper[4994]: I0310 00:07:29.288927 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:29 crc kubenswrapper[4994]: I0310 00:07:29.288987 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:29 crc kubenswrapper[4994]: I0310 00:07:29.289002 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:29 crc kubenswrapper[4994]: I0310 00:07:29.289030 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 00:07:29 crc kubenswrapper[4994]: E0310 00:07:29.296168 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 10 00:07:29 crc kubenswrapper[4994]: I0310 00:07:29.487665 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:30 crc kubenswrapper[4994]: I0310 00:07:30.488175 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 00:07:30 crc kubenswrapper[4994]: I0310 00:07:30.963851 4994 csr.go:261] certificate signing request csr-dn7g8 is approved, waiting to be issued
Mar 10 00:07:30 crc kubenswrapper[4994]: I0310 00:07:30.973076 4994 csr.go:257] certificate signing request csr-dn7g8 is issued
Mar 10 00:07:30 crc kubenswrapper[4994]: I0310 00:07:30.992035 4994 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 10 00:07:31 crc kubenswrapper[4994]: I0310 00:07:31.320842 4994 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 10 00:07:31 crc kubenswrapper[4994]: I0310 00:07:31.974528 4994 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-27 19:39:16.68495999 +0000 UTC
Mar 10 00:07:31 crc kubenswrapper[4994]: I0310 00:07:31.974588 4994 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7027h31m44.710377243s for next certificate rotation
Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.297145 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.298584 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.298644 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.298667 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.298860 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.312709 4994 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.313197 4994 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.313257 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.318575 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.318615 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.318626 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.318642 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.318654 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:36Z","lastTransitionTime":"2026-03-10T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.334803 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.343426 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.343476 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.343494 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.343517 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.343536 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:36Z","lastTransitionTime":"2026-03-10T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.357716 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.365197 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.365238 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.365251 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.365267 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.365280 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:36Z","lastTransitionTime":"2026-03-10T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.377825 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.386006 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.386056 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.386075 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.386097 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.386112 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:36Z","lastTransitionTime":"2026-03-10T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.401356 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.401611 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.401652 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.502324 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.603468 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.642448 4994 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.704134 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.805230 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.906258 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:37 crc kubenswrapper[4994]: E0310 00:07:37.007355 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:37 crc kubenswrapper[4994]: E0310 00:07:37.108240 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:37 crc kubenswrapper[4994]: 
E0310 00:07:37.209054 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:37 crc kubenswrapper[4994]: E0310 00:07:37.310079 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:37 crc kubenswrapper[4994]: E0310 00:07:37.410474 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:37 crc kubenswrapper[4994]: E0310 00:07:37.511612 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:37 crc kubenswrapper[4994]: E0310 00:07:37.612499 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:37 crc kubenswrapper[4994]: E0310 00:07:37.712845 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:37 crc kubenswrapper[4994]: E0310 00:07:37.813967 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:37 crc kubenswrapper[4994]: E0310 00:07:37.914837 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:38 crc kubenswrapper[4994]: E0310 00:07:38.015851 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:38 crc kubenswrapper[4994]: E0310 00:07:38.117194 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:38 crc kubenswrapper[4994]: E0310 00:07:38.218413 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:38 crc kubenswrapper[4994]: E0310 00:07:38.319526 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 10 00:07:38 crc kubenswrapper[4994]: E0310 00:07:38.421321 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:38 crc kubenswrapper[4994]: E0310 00:07:38.521492 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:38 crc kubenswrapper[4994]: E0310 00:07:38.622117 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:38 crc kubenswrapper[4994]: E0310 00:07:38.722278 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:38 crc kubenswrapper[4994]: E0310 00:07:38.823419 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:38 crc kubenswrapper[4994]: E0310 00:07:38.923945 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.024458 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.125102 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.226227 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.326391 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.426528 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.527701 4994 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Mar 10 00:07:39 crc kubenswrapper[4994]: I0310 00:07:39.553602 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:39 crc kubenswrapper[4994]: I0310 00:07:39.555209 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:39 crc kubenswrapper[4994]: I0310 00:07:39.555261 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:39 crc kubenswrapper[4994]: I0310 00:07:39.555277 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:39 crc kubenswrapper[4994]: I0310 00:07:39.556186 4994 scope.go:117] "RemoveContainer" containerID="4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.556497 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.628585 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.729098 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.829486 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.930534 4994 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 10 00:07:40 crc kubenswrapper[4994]: E0310 00:07:40.031916 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:40 crc kubenswrapper[4994]: E0310 00:07:40.133147 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:40 crc kubenswrapper[4994]: E0310 00:07:40.233656 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:40 crc kubenswrapper[4994]: E0310 00:07:40.334735 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:40 crc kubenswrapper[4994]: E0310 00:07:40.436221 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:40 crc kubenswrapper[4994]: E0310 00:07:40.537432 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:40 crc kubenswrapper[4994]: E0310 00:07:40.637983 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:40 crc kubenswrapper[4994]: E0310 00:07:40.738501 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:40 crc kubenswrapper[4994]: E0310 00:07:40.839070 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:40 crc kubenswrapper[4994]: I0310 00:07:40.904788 4994 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 10 00:07:40 crc kubenswrapper[4994]: E0310 00:07:40.940066 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:41 crc kubenswrapper[4994]: E0310 00:07:41.040480 4994 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:41 crc kubenswrapper[4994]: E0310 00:07:41.141260 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:41 crc kubenswrapper[4994]: E0310 00:07:41.242606 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:41 crc kubenswrapper[4994]: E0310 00:07:41.342957 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:41 crc kubenswrapper[4994]: E0310 00:07:41.443109 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:41 crc kubenswrapper[4994]: E0310 00:07:41.544092 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:41 crc kubenswrapper[4994]: E0310 00:07:41.644783 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:41 crc kubenswrapper[4994]: E0310 00:07:41.745163 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:41 crc kubenswrapper[4994]: E0310 00:07:41.846104 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:41 crc kubenswrapper[4994]: E0310 00:07:41.946280 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:42 crc kubenswrapper[4994]: E0310 00:07:42.046833 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:42 crc kubenswrapper[4994]: E0310 00:07:42.147120 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:42 crc kubenswrapper[4994]: E0310 00:07:42.247383 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:42 crc kubenswrapper[4994]: E0310 00:07:42.348489 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:42 crc kubenswrapper[4994]: E0310 00:07:42.449794 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:42 crc kubenswrapper[4994]: E0310 00:07:42.550646 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:42 crc kubenswrapper[4994]: E0310 00:07:42.651325 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:42 crc kubenswrapper[4994]: E0310 00:07:42.752009 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:42 crc kubenswrapper[4994]: E0310 00:07:42.852365 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:42 crc kubenswrapper[4994]: E0310 00:07:42.952677 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:43 crc kubenswrapper[4994]: E0310 00:07:43.053244 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:43 crc kubenswrapper[4994]: E0310 00:07:43.154284 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:43 crc kubenswrapper[4994]: E0310 00:07:43.254683 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:43 crc kubenswrapper[4994]: E0310 00:07:43.355794 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:43 crc kubenswrapper[4994]: E0310 00:07:43.455974 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:43 crc kubenswrapper[4994]: E0310 00:07:43.556291 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:43 crc kubenswrapper[4994]: E0310 00:07:43.657148 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:43 crc kubenswrapper[4994]: E0310 00:07:43.757591 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:43 crc kubenswrapper[4994]: E0310 00:07:43.857999 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:43 crc kubenswrapper[4994]: E0310 00:07:43.959119 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:44 crc kubenswrapper[4994]: E0310 00:07:44.060417 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:44 crc kubenswrapper[4994]: E0310 00:07:44.160609 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:44 crc kubenswrapper[4994]: E0310 00:07:44.261523 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:44 crc kubenswrapper[4994]: E0310 00:07:44.362011 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:44 crc kubenswrapper[4994]: E0310 00:07:44.463218 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:44 crc kubenswrapper[4994]: E0310 00:07:44.563631 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:44 crc kubenswrapper[4994]: E0310 00:07:44.664602 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:44 crc kubenswrapper[4994]: E0310 00:07:44.765320 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:44 crc kubenswrapper[4994]: E0310 00:07:44.865487 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:44 crc kubenswrapper[4994]: E0310 00:07:44.966361 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:45 crc kubenswrapper[4994]: E0310 00:07:45.066499 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:45 crc kubenswrapper[4994]: E0310 00:07:45.167495 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:45 crc kubenswrapper[4994]: E0310 00:07:45.268803 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:45 crc kubenswrapper[4994]: E0310 00:07:45.369668 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:45 crc kubenswrapper[4994]: E0310 00:07:45.471378 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:45 crc kubenswrapper[4994]: E0310 00:07:45.571727 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:45 crc kubenswrapper[4994]: E0310 00:07:45.671934 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:45 crc kubenswrapper[4994]: E0310 00:07:45.773113 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:45 crc kubenswrapper[4994]: E0310 00:07:45.874282 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:45 crc kubenswrapper[4994]: E0310 00:07:45.974405 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.075184 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.175377 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.276054 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.376552 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.476642 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.517090 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.522597 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.522643 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.522660 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasSufficientPID" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.522682 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.522702 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:46Z","lastTransitionTime":"2026-03-10T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.539473 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.545562 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.545637 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.545659 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.546158 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.546226 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:46Z","lastTransitionTime":"2026-03-10T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.563319 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.568901 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.568962 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.568980 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.569391 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.569443 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:46Z","lastTransitionTime":"2026-03-10T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.585034 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.590454 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.590521 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.590548 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.590581 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.590606 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:46Z","lastTransitionTime":"2026-03-10T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.605475 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.605691 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.605724 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.643539 4994 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.706107 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.807216 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.908131 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:47 crc kubenswrapper[4994]: E0310 00:07:47.008718 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:47 crc kubenswrapper[4994]: E0310 00:07:47.109605 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:47 crc kubenswrapper[4994]: E0310 00:07:47.210707 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:47 crc kubenswrapper[4994]: E0310 00:07:47.311741 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:47 crc kubenswrapper[4994]: 
E0310 00:07:47.412046 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:47 crc kubenswrapper[4994]: E0310 00:07:47.512924 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:47 crc kubenswrapper[4994]: E0310 00:07:47.613634 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.655237 4994 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.717590 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.717636 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.717653 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.717678 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.717695 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:47Z","lastTransitionTime":"2026-03-10T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.820801 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.820856 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.820913 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.820941 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.820959 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:47Z","lastTransitionTime":"2026-03-10T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.924124 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.924184 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.924202 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.924228 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.924247 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:47Z","lastTransitionTime":"2026-03-10T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.028460 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.028859 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.029060 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.029227 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.029401 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.132786 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.132850 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.132900 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.132927 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.132949 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.236143 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.236472 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.236607 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.236739 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.236904 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.340291 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.340356 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.340378 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.340407 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.340428 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.443844 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.443916 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.443928 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.443946 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.443958 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.513967 4994 apiserver.go:52] "Watching apiserver"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.521395 4994 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.522056 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-jhp6z","openshift-multus/multus-mcxcb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-target-xd92c","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn","openshift-dns/node-resolver-24l69","openshift-machine-config-operator/machine-config-daemon-kfljj","openshift-multus/multus-additional-cni-plugins-b2f6h","openshift-multus/network-metrics-daemon-vxjt2","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-ovn-kubernetes/ovnkube-node-ns797","openshift-network-node-identity/network-node-identity-vrzqb"]
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.522535 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.522775 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.523200 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.523239 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.523264 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.523673 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.523751 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.524275 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.524511 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.524555 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-24l69"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.524605 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kfljj"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.525464 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ns797"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.524417 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b2f6h"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.525368 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2"
Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.526066 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.524628 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jhp6z"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.526996 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mcxcb"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.527967 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.537141 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.537240 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.537494 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.537977 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.538165 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.538267 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.538415 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.538477 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.538922 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.539484 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.542127 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.542168 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.542445 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.542630 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.542707 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.542765 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.542973 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.543026 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.543067 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.543231 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.543308 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.543327 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.543497 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.543587 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.543711 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.543593 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.543913 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.544013 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.542466 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.544267 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.544317 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.544483 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.544709 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.545018 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.545655 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.545773 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.545132 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.547202 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.547248 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.547272 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.547304 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.547330 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.570077 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.584353 4994 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.584364 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599376 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599417 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599444 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599466 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599487 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599508 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599529 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599549 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599573 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599594 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599615 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599634 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599654 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599673 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599692 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599714 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599737 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599841 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599868 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599918 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599939 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599962 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599982 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600006 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600027 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600048 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600068 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600092 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600111 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600132 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600154 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600176 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600198 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600219 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600244 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600267 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600291 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600311 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600333 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600355 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310
00:07:48.600375 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600395 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600416 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600438 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600458 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600478 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600501 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600522 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600542 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600562 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600583 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600606 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600627 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600650 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600672 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600692 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600712 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " 
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600732 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600753 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600775 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600795 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600818 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600839 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600862 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600911 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600933 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600954 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600976 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.601004 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600994 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.601028 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.601140 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.601202 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.601359 4994 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.601731 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.601798 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.601851 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.601938 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.601991 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.602189 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:07:49.102025767 +0000 UTC m=+83.275732616 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.602257 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.602318 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.602359 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.602640 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.602740 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.603064 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.603341 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.603764 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.604123 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.604209 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.604844 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.603369 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.603361 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.605319 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.605359 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.605283 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.605708 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.606038 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.606171 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.606470 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.606703 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.607544 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.607567 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.607939 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.608662 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.608722 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.608751 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.608766 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.608823 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.609046 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.609068 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.609410 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.610340 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.610730 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.610767 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.610983 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.611542 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.611556 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.611605 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.611619 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.603517 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.611744 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.611834 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.611925 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.611963 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612012 4994 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612061 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612116 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612166 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612232 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612277 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612387 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612435 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612470 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612501 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612533 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612567 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612600 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612637 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612674 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612708 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612742 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 
00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612777 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612821 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612867 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612992 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613031 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613113 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613214 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613253 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613291 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613330 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613367 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613402 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613437 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613473 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613510 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613547 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613583 4994 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613622 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613657 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613694 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613728 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613800 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613838 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613914 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613953 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613993 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614031 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614069 4994 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614108 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614146 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614181 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614220 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614275 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614328 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614366 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614407 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614443 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612067 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612064 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612239 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612293 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612312 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612432 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612755 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612955 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613467 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613473 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613679 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613757 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614190 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614198 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614530 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.615171 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.615753 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.615776 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.616400 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.616476 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.616855 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.617191 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.617766 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.617807 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.618414 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.618714 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614539 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619101 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619152 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619193 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 10 00:07:48 crc 
kubenswrapper[4994]: I0310 00:07:48.619231 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619265 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619309 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619350 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619380 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619418 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619449 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619485 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619515 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619544 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619572 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619601 4994 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619632 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619693 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619725 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619757 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619790 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619826 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619857 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619918 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619959 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619991 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 00:07:48 crc 
kubenswrapper[4994]: I0310 00:07:48.620020 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620055 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620088 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620129 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620161 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620190 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620223 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620284 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620327 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620364 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620405 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " 
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620435 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620467 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620501 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620538 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620570 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620603 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620635 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620665 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620698 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620738 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620769 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 00:07:48 crc 
kubenswrapper[4994]: I0310 00:07:48.620800 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620915 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620953 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620985 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621017 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621052 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621084 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621116 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621158 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621296 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-ovn-kubernetes\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621349 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-bin\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621381 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-netd\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621411 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovn-node-metrics-cert\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621448 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-cni-dir\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621480 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621556 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-var-lib-cni-bin\") pod \"multus-mcxcb\" (UID: 
\"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621609 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621691 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621731 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-socket-dir-parent\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621763 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-env-overrides\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621800 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621838 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621893 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-etc-openvswitch\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621929 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621976 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622016 
4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwkj5\" (UniqueName: \"kubernetes.io/projected/7c9fd1f0-58d6-4986-86b5-8c26c871e79b-kube-api-access-hwkj5\") pod \"node-ca-jhp6z\" (UID: \"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\") " pod="openshift-image-registry/node-ca-jhp6z" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622051 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ced5d66d-39df-4267-b801-e1e60d517ace-mcd-auth-proxy-config\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622089 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r5sl\" (UniqueName: \"kubernetes.io/projected/ced5d66d-39df-4267-b801-e1e60d517ace-kube-api-access-9r5sl\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622118 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-log-socket\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622148 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-env-overrides\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622179 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dkqt\" (UniqueName: \"kubernetes.io/projected/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-kube-api-access-7dkqt\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622209 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-var-lib-cni-multus\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622240 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfsnl\" (UniqueName: \"kubernetes.io/projected/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-kube-api-access-kfsnl\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622306 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ced5d66d-39df-4267-b801-e1e60d517ace-proxy-tls\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622352 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622383 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s42gc\" (UniqueName: \"kubernetes.io/projected/72a13a81-4c11-4529-8a3d-2dd3c73215a7-kube-api-access-s42gc\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622416 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-conf-dir\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622803 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-etc-kubernetes\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622865 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7c9fd1f0-58d6-4986-86b5-8c26c871e79b-serviceca\") pod \"node-ca-jhp6z\" (UID: \"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\") " pod="openshift-image-registry/node-ca-jhp6z" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.623955 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-cnibin\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624008 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-run-netns\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624421 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624479 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624522 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624560 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624599 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-slash\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624630 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-ovn\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624668 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624703 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc 
kubenswrapper[4994]: I0310 00:07:48.624746 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624783 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624821 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624853 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-daemon-config\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624934 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-run-k8s-cni-cncf-io\") pod \"multus-mcxcb\" (UID: 
\"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624974 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-kubelet\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625000 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-os-release\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625023 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-os-release\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625045 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-run-multus-certs\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625070 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: 
\"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625093 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5gdt\" (UniqueName: \"kubernetes.io/projected/f4c125b3-4a9c-46a7-a468-54e93c44751d-kube-api-access-m5gdt\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625136 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/194b252b-4eca-42f4-85e1-5c51a42eb407-hosts-file\") pod \"node-resolver-24l69\" (UID: \"194b252b-4eca-42f4-85e1-5c51a42eb407\") " pod="openshift-dns/node-resolver-24l69" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625158 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-systemd-units\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625181 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-openvswitch\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625203 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-config\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625224 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-script-lib\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625271 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-system-cni-dir\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625294 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625315 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c9fd1f0-58d6-4986-86b5-8c26c871e79b-host\") pod \"node-ca-jhp6z\" (UID: \"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\") " pod="openshift-image-registry/node-ca-jhp6z" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625336 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-var-lib-kubelet\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625358 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-systemd\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625378 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-cnibin\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625399 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-cni-binary-copy\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625420 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-system-cni-dir\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625441 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-netns\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625466 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625490 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrsmg\" (UniqueName: \"kubernetes.io/projected/194b252b-4eca-42f4-85e1-5c51a42eb407-kube-api-access-hrsmg\") pod \"node-resolver-24l69\" (UID: \"194b252b-4eca-42f4-85e1-5c51a42eb407\") " pod="openshift-dns/node-resolver-24l69" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625511 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ced5d66d-39df-4267-b801-e1e60d517ace-rootfs\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625807 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-var-lib-openvswitch\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc 
kubenswrapper[4994]: I0310 00:07:48.625848 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6dac87a5-07eb-488d-85fe-cb8848434ae5-cni-binary-copy\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625904 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-hostroot\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625933 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4lnm\" (UniqueName: \"kubernetes.io/projected/6dac87a5-07eb-488d-85fe-cb8848434ae5-kube-api-access-k4lnm\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.626058 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-node-log\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.626259 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.626471 4994 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.626499 4994 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.626519 4994 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.626574 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.626599 4994 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.628631 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.629566 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" 
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619270 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619454 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619675 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619642 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620063 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620227 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620299 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620934 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621736 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621847 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622666 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622705 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.623063 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.623405 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.623736 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.623760 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.630071 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.630221 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.630306 4994 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.630360 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.630446 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.630495 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.631045 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.631122 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.631188 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.631378 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.632320 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.632349 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.632344 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.634176 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.623845 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.623928 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.623940 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624059 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624085 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624336 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624473 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624514 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624530 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624555 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624601 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624651 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625222 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625248 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625517 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625660 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.626182 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.626246 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.626533 4994 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.626841 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.627073 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.627776 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.628257 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.628557 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.628563 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.640406 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.628576 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.627849 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.628682 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.629052 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.629104 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.629125 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.629580 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.634582 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.634795 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.635061 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.635157 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.635612 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.635857 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.636409 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.637259 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.637280 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.638015 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.638016 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.638087 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.638246 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.638301 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.638575 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.638599 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.638700 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.638802 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.638864 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639030 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639088 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639535 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639560 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639594 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639695 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639804 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639952 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639979 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639971 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.640049 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.640129 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.640211 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.640607 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.640691 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.641051 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.641088 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.641221 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.641465 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.641505 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.641171 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.623779 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639410 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.644974 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.645073 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.645524 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.645695 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.645844 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.645944 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.646274 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.646327 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.646458 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.646552 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.646580 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.647012 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:49.146989137 +0000 UTC m=+83.320695896 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.647180 4994 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.647308 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:49.147292217 +0000 UTC m=+83.320998966 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648292 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648385 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648455 4994 reconciler_common.go:293] "Volume detached for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648516 4994 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648574 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648634 4994 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648692 4994 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648749 4994 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648807 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648861 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648952 4994 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649008 4994 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649060 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649115 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649178 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649233 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649292 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node 
\"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649345 4994 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649400 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649457 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649508 4994 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649570 4994 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649631 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649688 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649744 4994 
reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.650567 4994 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.650616 4994 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.650639 4994 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.650666 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.650703 4994 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.650839 4994 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.650966 4994 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651006 4994 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651025 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651042 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651060 4994 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651084 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651102 4994 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651213 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651293 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651327 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651345 4994 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651362 4994 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651383 4994 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651482 4994 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651501 4994 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on 
node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651519 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651543 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651558 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651571 4994 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651584 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651600 4994 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651612 4994 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651624 
4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651650 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651663 4994 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651676 4994 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651689 4994 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651705 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.655111 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.655423 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.655439 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.655746 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.655774 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.656179 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.657431 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.658037 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.659888 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.660169 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.661284 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.661473 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.661580 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.662062 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.664101 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.664579 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.664740 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.664860 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.664926 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.664943 4994 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.665036 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:49.16501547 +0000 UTC m=+83.338722239 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.665221 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.670715 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.670761 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.670943 4994 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.670717 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.671037 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.671058 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:49.171020835 +0000 UTC m=+83.344727624 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.671060 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.671116 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.671141 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.674673 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.674737 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.675505 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.684511 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.685257 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.685646 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.694587 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.696029 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.705652 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.716272 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.728843 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.737232 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.746823 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753320 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/194b252b-4eca-42f4-85e1-5c51a42eb407-hosts-file\") pod \"node-resolver-24l69\" (UID: \"194b252b-4eca-42f4-85e1-5c51a42eb407\") " pod="openshift-dns/node-resolver-24l69" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753428 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-systemd-units\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753459 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-openvswitch\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753482 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-config\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753527 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-script-lib\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753554 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-system-cni-dir\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753576 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c9fd1f0-58d6-4986-86b5-8c26c871e79b-host\") pod \"node-ca-jhp6z\" (UID: \"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\") " pod="openshift-image-registry/node-ca-jhp6z" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753597 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-var-lib-kubelet\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753607 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-systemd-units\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753727 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/194b252b-4eca-42f4-85e1-5c51a42eb407-hosts-file\") pod \"node-resolver-24l69\" (UID: \"194b252b-4eca-42f4-85e1-5c51a42eb407\") " pod="openshift-dns/node-resolver-24l69" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753689 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753764 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-cnibin\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753783 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753800 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-system-cni-dir\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753815 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-netns\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753831 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-systemd\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753848 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrsmg\" (UniqueName: \"kubernetes.io/projected/194b252b-4eca-42f4-85e1-5c51a42eb407-kube-api-access-hrsmg\") pod \"node-resolver-24l69\" (UID: \"194b252b-4eca-42f4-85e1-5c51a42eb407\") " pod="openshift-dns/node-resolver-24l69" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753885 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ced5d66d-39df-4267-b801-e1e60d517ace-rootfs\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " 
pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753902 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-var-lib-openvswitch\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753918 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6dac87a5-07eb-488d-85fe-cb8848434ae5-cni-binary-copy\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753936 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4lnm\" (UniqueName: \"kubernetes.io/projected/6dac87a5-07eb-488d-85fe-cb8848434ae5-kube-api-access-k4lnm\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753952 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-node-log\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753969 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-hostroot\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753996 4994 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-bin\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754014 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-netd\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754030 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovn-node-metrics-cert\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754045 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-cni-dir\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754063 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754081 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-ovn-kubernetes\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754100 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-var-lib-cni-bin\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754085 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c9fd1f0-58d6-4986-86b5-8c26c871e79b-host\") pod \"node-ca-jhp6z\" (UID: \"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\") " pod="openshift-image-registry/node-ca-jhp6z" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754126 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-socket-dir-parent\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754161 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-cnibin\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753766 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-openvswitch\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754187 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-system-cni-dir\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754227 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-env-overrides\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754340 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754504 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ced5d66d-39df-4267-b801-e1e60d517ace-rootfs\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc 
kubenswrapper[4994]: I0310 00:07:48.754521 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-socket-dir-parent\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754553 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-var-lib-openvswitch\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754599 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-bin\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754605 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-netns\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754638 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-systemd\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754655 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-var-lib-kubelet\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754649 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754695 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-cni-binary-copy\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754728 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-ovn-kubernetes\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754735 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754745 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-netd\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754813 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-node-log\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754737 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-etc-openvswitch\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754942 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-system-cni-dir\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754972 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754992 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-var-lib-cni-bin\") pod 
\"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754998 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755051 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-cni-dir\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755090 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwkj5\" (UniqueName: \"kubernetes.io/projected/7c9fd1f0-58d6-4986-86b5-8c26c871e79b-kube-api-access-hwkj5\") pod \"node-ca-jhp6z\" (UID: \"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\") " pod="openshift-image-registry/node-ca-jhp6z" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755121 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-config\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.755136 4994 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755174 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-script-lib\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755183 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-hostroot\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755183 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-etc-openvswitch\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755219 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755296 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r5sl\" (UniqueName: \"kubernetes.io/projected/ced5d66d-39df-4267-b801-e1e60d517ace-kube-api-access-9r5sl\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755317 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-log-socket\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.755366 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs podName:f4c125b3-4a9c-46a7-a468-54e93c44751d nodeName:}" failed. No retries permitted until 2026-03-10 00:07:49.255342935 +0000 UTC m=+83.429049694 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs") pod "network-metrics-daemon-vxjt2" (UID: "f4c125b3-4a9c-46a7-a468-54e93c44751d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755412 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-env-overrides\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755443 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dkqt\" (UniqueName: \"kubernetes.io/projected/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-kube-api-access-7dkqt\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755523 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-log-socket\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755605 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-var-lib-cni-multus\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755634 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfsnl\" (UniqueName: \"kubernetes.io/projected/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-kube-api-access-kfsnl\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755707 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ced5d66d-39df-4267-b801-e1e60d517ace-proxy-tls\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755850 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-var-lib-cni-multus\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755930 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ced5d66d-39df-4267-b801-e1e60d517ace-mcd-auth-proxy-config\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " 
pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.756471 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ced5d66d-39df-4267-b801-e1e60d517ace-mcd-auth-proxy-config\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.757665 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-env-overrides\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.757734 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758009 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s42gc\" (UniqueName: \"kubernetes.io/projected/72a13a81-4c11-4529-8a3d-2dd3c73215a7-kube-api-access-s42gc\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758101 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-conf-dir\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " 
pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758224 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-etc-kubernetes\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758418 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7c9fd1f0-58d6-4986-86b5-8c26c871e79b-serviceca\") pod \"node-ca-jhp6z\" (UID: \"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\") " pod="openshift-image-registry/node-ca-jhp6z" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758456 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758477 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-env-overrides\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758490 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-run-netns\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc 
kubenswrapper[4994]: I0310 00:07:48.758508 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-etc-kubernetes\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758573 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-cnibin\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758605 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-ovn\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758664 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758678 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-conf-dir\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758715 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-slash\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758755 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-daemon-config\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758808 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758927 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-run-k8s-cni-cncf-io\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758981 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-os-release\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759013 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-os-release\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759045 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-run-multus-certs\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759078 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5gdt\" (UniqueName: \"kubernetes.io/projected/f4c125b3-4a9c-46a7-a468-54e93c44751d-kube-api-access-m5gdt\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759114 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-kubelet\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759165 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759191 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-run-netns\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759207 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759224 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-ovn\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759246 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-cnibin\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759293 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-os-release\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759496 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6dac87a5-07eb-488d-85fe-cb8848434ae5-cni-binary-copy\") pod \"multus-mcxcb\" (UID: 
\"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759554 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-slash\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760349 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760373 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-run-k8s-cni-cncf-io\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760406 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-kubelet\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760422 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-run-multus-certs\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc 
kubenswrapper[4994]: I0310 00:07:48.760473 4994 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760485 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760494 4994 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760505 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760517 4994 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760526 4994 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760536 4994 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760545 4994 reconciler_common.go:293] "Volume detached for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760555 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760565 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760576 4994 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760585 4994 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760595 4994 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760604 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760614 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760624 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760633 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760641 4994 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760650 4994 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760660 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760669 4994 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760677 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" 
DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760686 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760694 4994 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760703 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760711 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760720 4994 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760730 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760739 4994 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760747 
4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760756 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760765 4994 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760774 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760781 4994 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760791 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760799 4994 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760807 4994 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760815 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760824 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760832 4994 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760840 4994 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760849 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760858 4994 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760884 4994 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 
crc kubenswrapper[4994]: I0310 00:07:48.760893 4994 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760902 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760910 4994 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760918 4994 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760927 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760936 4994 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760945 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760953 4994 reconciler_common.go:293] "Volume detached for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760962 4994 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760972 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760981 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760991 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761001 4994 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761010 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761020 4994 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761030 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761038 4994 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761047 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761055 4994 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761064 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761072 4994 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761080 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761089 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761099 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761108 4994 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761117 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761126 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761134 4994 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761142 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node 
\"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761151 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761160 4994 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761169 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761178 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761187 4994 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761196 4994 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761204 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" 
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761214 4994 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761223 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761231 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761239 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761248 4994 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761256 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761265 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761273 4994 reconciler_common.go:293] "Volume detached for 
volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761282 4994 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761291 4994 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761300 4994 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761309 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761318 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761326 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761335 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761343 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761351 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761360 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761370 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761379 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761386 4994 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761395 4994 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" 
DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761404 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761412 4994 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761420 4994 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761428 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761437 4994 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761445 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761454 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761462 
4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761470 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761479 4994 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761488 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761497 4994 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761505 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761513 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761521 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761533 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761541 4994 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761549 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761557 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761565 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761573 4994 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761582 4994 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" 
DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761590 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761597 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761606 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761614 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761623 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.762383 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-daemon-config\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.762735 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ced5d66d-39df-4267-b801-e1e60d517ace-proxy-tls\") pod \"machine-config-daemon-kfljj\" (UID: 
\"ced5d66d-39df-4267-b801-e1e60d517ace\") " pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.763746 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7c9fd1f0-58d6-4986-86b5-8c26c871e79b-serviceca\") pod \"node-ca-jhp6z\" (UID: \"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\") " pod="openshift-image-registry/node-ca-jhp6z" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.766596 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovn-node-metrics-cert\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.771298 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.771417 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-os-release\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.780247 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.782722 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwkj5\" (UniqueName: \"kubernetes.io/projected/7c9fd1f0-58d6-4986-86b5-8c26c871e79b-kube-api-access-hwkj5\") pod \"node-ca-jhp6z\" (UID: \"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\") " pod="openshift-image-registry/node-ca-jhp6z" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.784448 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfsnl\" (UniqueName: \"kubernetes.io/projected/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-kube-api-access-kfsnl\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.785694 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4lnm\" (UniqueName: \"kubernetes.io/projected/6dac87a5-07eb-488d-85fe-cb8848434ae5-kube-api-access-k4lnm\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.785975 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r5sl\" (UniqueName: 
\"kubernetes.io/projected/ced5d66d-39df-4267-b801-e1e60d517ace-kube-api-access-9r5sl\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.787118 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.787143 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.787151 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.787165 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.787174 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.790053 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dkqt\" (UniqueName: \"kubernetes.io/projected/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-kube-api-access-7dkqt\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.790071 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrsmg\" (UniqueName: \"kubernetes.io/projected/194b252b-4eca-42f4-85e1-5c51a42eb407-kube-api-access-hrsmg\") pod \"node-resolver-24l69\" (UID: \"194b252b-4eca-42f4-85e1-5c51a42eb407\") " pod="openshift-dns/node-resolver-24l69" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.791356 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5gdt\" (UniqueName: \"kubernetes.io/projected/f4c125b3-4a9c-46a7-a468-54e93c44751d-kube-api-access-m5gdt\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.791976 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s42gc\" (UniqueName: \"kubernetes.io/projected/72a13a81-4c11-4529-8a3d-2dd3c73215a7-kube-api-access-s42gc\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.795274 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.851238 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.860856 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.870942 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:48 crc kubenswrapper[4994]: W0310 00:07:48.872097 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-e966cda8e8654747d75214e22bb99cbbf074fa8600f3d57f77ebee1a12ed4778 WatchSource:0}: Error finding container e966cda8e8654747d75214e22bb99cbbf074fa8600f3d57f77ebee1a12ed4778: Status 404 returned error can't find the container with id e966cda8e8654747d75214e22bb99cbbf074fa8600f3d57f77ebee1a12ed4778 Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.880977 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-24l69" Mar 10 00:07:48 crc kubenswrapper[4994]: W0310 00:07:48.882058 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-9887da5925f5a1e634859afec8aa0e2b7ceedea60ce74d5b0fdf9e04e500d942 WatchSource:0}: Error finding container 9887da5925f5a1e634859afec8aa0e2b7ceedea60ce74d5b0fdf9e04e500d942: Status 404 returned error can't find the container with id 9887da5925f5a1e634859afec8aa0e2b7ceedea60ce74d5b0fdf9e04e500d942 Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.889665 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.889714 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.889732 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.889756 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.889774 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.891520 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: W0310 00:07:48.892612 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-ffb01b4eb5e35ecfb3ea97a0de5f6095a52ba4b2ae1914794e1a1e28f636cc35 WatchSource:0}: Error finding container ffb01b4eb5e35ecfb3ea97a0de5f6095a52ba4b2ae1914794e1a1e28f636cc35: Status 404 returned error can't find the container with id ffb01b4eb5e35ecfb3ea97a0de5f6095a52ba4b2ae1914794e1a1e28f636cc35 Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.902361 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ffb01b4eb5e35ecfb3ea97a0de5f6095a52ba4b2ae1914794e1a1e28f636cc35"} Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.903624 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.904183 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9887da5925f5a1e634859afec8aa0e2b7ceedea60ce74d5b0fdf9e04e500d942"} Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.905770 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e966cda8e8654747d75214e22bb99cbbf074fa8600f3d57f77ebee1a12ed4778"} Mar 10 00:07:48 crc kubenswrapper[4994]: W0310 00:07:48.921044 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194b252b_4eca_42f4_85e1_5c51a42eb407.slice/crio-5e381bb4e65c0584431ab12bdf67696869975d892e7a898c77054c1c289aa596 WatchSource:0}: Error finding container 5e381bb4e65c0584431ab12bdf67696869975d892e7a898c77054c1c289aa596: Status 404 returned error can't find the container with id 5e381bb4e65c0584431ab12bdf67696869975d892e7a898c77054c1c289aa596 Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.926783 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: W0310 00:07:48.944780 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72a13a81_4c11_4529_8a3d_2dd3c73215a7.slice/crio-b8da989a4363394b0ce6c6c658409a1fc0f3d0c82d2ec6c0704e2ab145277cf7 WatchSource:0}: Error finding container b8da989a4363394b0ce6c6c658409a1fc0f3d0c82d2ec6c0704e2ab145277cf7: Status 404 returned error can't find the container with id b8da989a4363394b0ce6c6c658409a1fc0f3d0c82d2ec6c0704e2ab145277cf7 Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.982370 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.991955 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.991988 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.991998 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.992014 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.992026 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.993809 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.001602 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jhp6z" Mar 10 00:07:49 crc kubenswrapper[4994]: W0310 00:07:49.010317 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dac87a5_07eb_488d_85fe_cb8848434ae5.slice/crio-7810b3b0c224bd62f286711d7bde48e3d72f3ed4791704b05bc78daa14b5ff2c WatchSource:0}: Error finding container 7810b3b0c224bd62f286711d7bde48e3d72f3ed4791704b05bc78daa14b5ff2c: Status 404 returned error can't find the container with id 7810b3b0c224bd62f286711d7bde48e3d72f3ed4791704b05bc78daa14b5ff2c Mar 10 00:07:49 crc kubenswrapper[4994]: W0310 00:07:49.052565 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c9fd1f0_58d6_4986_86b5_8c26c871e79b.slice/crio-607b6977609ad51e465ced1835f89caa686f47e8d1b53335c04f3b1e0107a67b WatchSource:0}: Error finding container 607b6977609ad51e465ced1835f89caa686f47e8d1b53335c04f3b1e0107a67b: Status 404 returned error can't find the container with id 607b6977609ad51e465ced1835f89caa686f47e8d1b53335c04f3b1e0107a67b Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.095085 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.095129 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.095140 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.095159 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.095172 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.167294 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.167419 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.167448 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 
10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.167498 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:07:50.167461682 +0000 UTC m=+84.341168431 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.167536 4994 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.167591 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:50.167577826 +0000 UTC m=+84.341284575 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.167618 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.167649 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.167660 4994 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.167661 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.167712 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-10 00:07:50.16769624 +0000 UTC m=+84.341402989 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.167784 4994 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.167839 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:50.167830104 +0000 UTC m=+84.341536943 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.199299 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.199329 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.199339 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.199353 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.199365 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.268130 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.268362 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.268380 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.268391 4994 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.268914 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.269224 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-10 00:07:50.269209204 +0000 UTC m=+84.442915953 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.269433 4994 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.269465 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs podName:f4c125b3-4a9c-46a7-a468-54e93c44751d nodeName:}" failed. No retries permitted until 2026-03-10 00:07:50.269456793 +0000 UTC m=+84.443163542 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs") pod "network-metrics-daemon-vxjt2" (UID: "f4c125b3-4a9c-46a7-a468-54e93c44751d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.301976 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.302385 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.302396 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.302413 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.302422 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.404346 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.404381 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.404394 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.404418 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.404428 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.507348 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.507386 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.507395 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.507410 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.507420 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.611591 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.611650 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.611668 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.611691 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.611708 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.715300 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.715348 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.715359 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.715375 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.715389 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.773619 4994 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.818413 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.818459 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.818470 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.818487 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.818500 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.913778 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jhp6z" event={"ID":"7c9fd1f0-58d6-4986-86b5-8c26c871e79b","Type":"ContainerStarted","Data":"61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.913829 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jhp6z" event={"ID":"7c9fd1f0-58d6-4986-86b5-8c26c871e79b","Type":"ContainerStarted","Data":"607b6977609ad51e465ced1835f89caa686f47e8d1b53335c04f3b1e0107a67b"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.917516 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" event={"ID":"cd1d8032-7c65-474f-9a19-a93bf0cac8ba","Type":"ContainerStarted","Data":"64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.917548 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" event={"ID":"cd1d8032-7c65-474f-9a19-a93bf0cac8ba","Type":"ContainerStarted","Data":"049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.917564 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" event={"ID":"cd1d8032-7c65-474f-9a19-a93bf0cac8ba","Type":"ContainerStarted","Data":"72de72f0dc1d88f178eb20671aedb9c97f4717c5ab9c8ccae29de4193ac08349"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.920043 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6" exitCode=0 Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 
00:07:49.920132 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.920184 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"b8da989a4363394b0ce6c6c658409a1fc0f3d0c82d2ec6c0704e2ab145277cf7"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.922802 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.922863 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.922924 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.922953 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.922976 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.924565 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.928077 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.928141 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.928165 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"3b13cdfd69ef4b401eb74eaace11f32c9b70f675cf90d3ad73f8b7acf3371165"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.931044 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.931129 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.932865 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mcxcb" event={"ID":"6dac87a5-07eb-488d-85fe-cb8848434ae5","Type":"ContainerStarted","Data":"5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.932915 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mcxcb" event={"ID":"6dac87a5-07eb-488d-85fe-cb8848434ae5","Type":"ContainerStarted","Data":"7810b3b0c224bd62f286711d7bde48e3d72f3ed4791704b05bc78daa14b5ff2c"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.935395 4994 generic.go:334] "Generic (PLEG): container finished" podID="2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08" containerID="23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3" exitCode=0 Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.935490 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" event={"ID":"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08","Type":"ContainerDied","Data":"23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.935589 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" event={"ID":"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08","Type":"ContainerStarted","Data":"5dbf18b53d3f7ca2f039c6e87cadd8f5dd12f1d848f94884c1284843cc640226"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.937490 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-24l69" event={"ID":"194b252b-4eca-42f4-85e1-5c51a42eb407","Type":"ContainerStarted","Data":"d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f"} Mar 10 
00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.937530 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-24l69" event={"ID":"194b252b-4eca-42f4-85e1-5c51a42eb407","Type":"ContainerStarted","Data":"5e381bb4e65c0584431ab12bdf67696869975d892e7a898c77054c1c289aa596"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.939022 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.962420 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.979035 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:49 crc 
kubenswrapper[4994]: I0310 00:07:49.998266 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.021403 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.025563 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.025596 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.025609 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.025626 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.025638 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.047619 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.062826 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.075171 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.091227 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.108952 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.125420 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.128182 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.128224 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.128235 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.128253 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.128264 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.140310 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.157865 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.177837 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.180255 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.180351 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.180381 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.180401 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.180490 4994 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.180538 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:52.180526688 +0000 UTC m=+86.354233437 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.180593 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.180604 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.180613 4994 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.180634 4994 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.180655 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:07:52.180625621 +0000 UTC m=+86.354332400 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.180693 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:52.180680263 +0000 UTC m=+86.354387052 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.180715 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:52.180704004 +0000 UTC m=+86.354410793 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.200096 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.226802 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.229942 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.229966 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.229974 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.229987 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.229995 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.247179 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.260537 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.273315 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.281002 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.281066 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.281167 4994 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.281191 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.281201 4994 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.281207 4994 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.281250 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:52.281234045 +0000 UTC m=+86.454940794 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.281269 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs podName:f4c125b3-4a9c-46a7-a468-54e93c44751d nodeName:}" failed. 
No retries permitted until 2026-03-10 00:07:52.281261716 +0000 UTC m=+86.454968465 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs") pod "network-metrics-daemon-vxjt2" (UID: "f4c125b3-4a9c-46a7-a468-54e93c44751d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.281669 4994 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.289595 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.302719 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07
:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.319352 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.332774 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.332817 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.332833 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc 
kubenswrapper[4994]: I0310 00:07:50.332856 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.332896 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.334008 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.347713 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.361410 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.379195 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.388958 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc 
kubenswrapper[4994]: I0310 00:07:50.400419 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.435744 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.435773 4994 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.435782 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.435796 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.435807 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.538558 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.538618 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.538636 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.538661 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.538683 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.553887 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.553905 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.553941 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.553989 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.554078 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.554164 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.554172 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.554356 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.558689 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.559578 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.560999 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.561902 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.563004 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.563603 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.564290 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.565344 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.566073 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.567102 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.567705 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.570346 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.571137 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.571772 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.572454 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.573009 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.573896 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.574563 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.575449 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.576311 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.577016 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.577752 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.578361 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.581287 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.581837 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.582434 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.583058 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.583584 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.585034 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.585510 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.585957 4994 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.586052 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.587308 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.587812 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.589238 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.590695 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.591379 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.592270 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.592910 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.593909 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.594374 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.595432 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.596234 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.597153 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.597583 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.598414 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.599210 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.600362 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.600842 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.601622 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.602074 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.602958 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.603718 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.604185 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.641518 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.641585 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.641604 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.641628 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.641647 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.744338 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.744842 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.744917 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.744955 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.744977 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.847675 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.847711 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.847720 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.847734 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.847744 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.944278 4994 generic.go:334] "Generic (PLEG): container finished" podID="2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08" containerID="a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3" exitCode=0
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.944351 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" event={"ID":"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08","Type":"ContainerDied","Data":"a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3"}
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.952157 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.952397 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.952488 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.952571 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.952652 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.960170 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85"}
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.960336 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b"}
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.960937 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700"}
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.961102 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd"}
Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.971107 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.989035 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.007964 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.022278 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc 
kubenswrapper[4994]: I0310 00:07:51.037523 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.057009 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.057048 4994 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.057060 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.057086 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.057099 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.058703 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.121356 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.143472 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.162689 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.163002 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.163014 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.163032 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.163045 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.164414 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.197724 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.212343 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.223949 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.237077 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.252312 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.265248 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc 
kubenswrapper[4994]: I0310 00:07:51.265284 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.265293 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.265307 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.265319 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.367655 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.367688 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.367699 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.367714 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.367725 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.470098 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.470153 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.470171 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.470197 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.470218 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.568809 4994 scope.go:117] "RemoveContainer" containerID="4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4" Mar 10 00:07:51 crc kubenswrapper[4994]: E0310 00:07:51.569159 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.570145 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.573137 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.573191 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.573211 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.573233 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.573255 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.677455 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.677521 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.677542 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.677567 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.677589 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.780352 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.780409 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.780422 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.780443 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.780457 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.884426 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.884486 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.884503 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.884527 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.884546 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.967738 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" event={"ID":"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08","Type":"ContainerDied","Data":"a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.967725 4994 generic.go:334] "Generic (PLEG): container finished" podID="2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08" containerID="a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877" exitCode=0 Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.977828 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.977962 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.978608 4994 scope.go:117] "RemoveContainer" containerID="4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4" Mar 10 00:07:51 crc kubenswrapper[4994]: E0310 00:07:51.978904 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.987389 4994 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.987464 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.987486 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.987514 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.987536 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.997107 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 
00:07:52.015805 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc 
kubenswrapper[4994]: I0310 00:07:52.032504 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.053651 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.070074 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.089437 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z"
Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.090399 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.090440 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.090453 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.090473 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.090485 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.115755 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z"
Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.136050 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z"
Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.149638 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z"
Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.163287 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z"
Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.176988 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z"
Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.193778 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z"
Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.194449 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.194503 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.194521 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.194542 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.194555 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.197081 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.197272 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:07:56.197235211 +0000 UTC m=+90.370941990 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.197379 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.197450 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.197493 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.197567 4994 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.197568 4994 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.197624 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:56.197607174 +0000 UTC m=+90.371313923 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.197669 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:56.197650645 +0000 UTC m=+90.371357404 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.197716 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.197739 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.197759 4994 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.197812 4994 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:56.19779727 +0000 UTC m=+90.371504049 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.211064 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.234400 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.253284 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.297154 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc 
kubenswrapper[4994]: I0310 00:07:52.297203 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.297217 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.297235 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.297247 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.303252 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.303417 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.303463 4994 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.303543 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs podName:f4c125b3-4a9c-46a7-a468-54e93c44751d nodeName:}" failed. No retries permitted until 2026-03-10 00:07:56.303522678 +0000 UTC m=+90.477229447 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs") pod "network-metrics-daemon-vxjt2" (UID: "f4c125b3-4a9c-46a7-a468-54e93c44751d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.303617 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.303645 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.303664 4994 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.303735 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:56.303710175 +0000 UTC m=+90.477416964 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.400299 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.400358 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.400378 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.400404 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.400423 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.505820 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.505919 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.505937 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.505966 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.505990 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.553829 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.553897 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.553960 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.553969 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.554038 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.554203 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.554258 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.554389 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.608543 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.609562 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.609591 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.609613 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.609625 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.713487 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.713573 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.713592 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.713623 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.713647 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.816143 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.816209 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.816231 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.816263 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.816286 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.919186 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.919237 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.919254 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.919278 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.919295 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.983822 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d"} Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.988245 4994 generic.go:334] "Generic (PLEG): container finished" podID="2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08" containerID="ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58" exitCode=0 Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.988296 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" event={"ID":"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08","Type":"ContainerDied","Data":"ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.012304 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.022644 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.022772 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.022799 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.022828 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.024220 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.037364 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.060981 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2
115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.078807 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc 
kubenswrapper[4994]: I0310 00:07:53.098600 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.121122 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.127174 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.127231 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.127251 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.127278 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.127299 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.153368 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.169194 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.182962 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.198452 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.217962 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.232582 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.232613 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.232623 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.232638 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.232649 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.235111 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.260662 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.275167 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.291228 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.310435 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 
00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.327055 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.336051 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.336105 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.336128 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.336159 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.336182 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.348343 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.375297 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.391855 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.410798 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.425645 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.439567 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.439598 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.439611 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.439626 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.439638 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.441018 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.454512 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.470621 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.478811 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.499835 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.510528 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.520216 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.530717 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.542129 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.542318 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.542395 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc 
kubenswrapper[4994]: I0310 00:07:53.542470 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.542547 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.645693 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.645736 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.645744 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.645757 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.645765 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.748347 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.750098 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.750843 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.751866 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.752047 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.855086 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.855147 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.855165 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.855194 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.855212 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.958638 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.958703 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.958721 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.958747 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.958765 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.994858 4994 generic.go:334] "Generic (PLEG): container finished" podID="2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08" containerID="59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1" exitCode=0 Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.995229 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" event={"ID":"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08","Type":"ContainerDied","Data":"59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.012468 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaea
d203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"o
s-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\
\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.013922 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.028356 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc 
kubenswrapper[4994]: I0310 00:07:54.040916 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.060531 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.064704 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.064971 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.065140 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.065427 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.065602 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.073537 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.089785 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 
00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.123885 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.134107 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.144736 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.159148 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.168743 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.168783 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.168797 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.168813 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.168824 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.171274 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.180673 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.196327 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.215636 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.227083 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.271346 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc 
kubenswrapper[4994]: I0310 00:07:54.271388 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.271399 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.271416 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.271429 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.373215 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.373568 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.373750 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.373984 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.374198 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.477552 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.477944 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.478193 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.478399 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.478587 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.553334 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.553410 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.553561 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:54 crc kubenswrapper[4994]: E0310 00:07:54.554228 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:54 crc kubenswrapper[4994]: E0310 00:07:54.554372 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:54 crc kubenswrapper[4994]: E0310 00:07:54.554412 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.554859 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:54 crc kubenswrapper[4994]: E0310 00:07:54.555246 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.582510 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.582569 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.582588 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.582610 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.582631 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.685706 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.685773 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.685793 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.685820 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.685839 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.789266 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.789323 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.789341 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.789367 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.789386 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.892432 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.892502 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.892527 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.892558 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.892580 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.996330 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.996673 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.998711 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.998860 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.999028 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.024190 4994 generic.go:334] "Generic (PLEG): container finished" podID="2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08" containerID="74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9" exitCode=0 Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.024261 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" event={"ID":"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08","Type":"ContainerDied","Data":"74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9"} Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.048592 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.066237 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.090480 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.105540 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.105620 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.105645 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.105677 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.105700 4994 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.106854 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc 
kubenswrapper[4994]: I0310 00:07:55.123430 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.142994 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.173909 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.188394 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.204425 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.209167 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.209212 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.209231 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.209255 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.209273 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.222760 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.238739 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.253413 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.271809 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28
b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.286700 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.305858 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.311340 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc 
kubenswrapper[4994]: I0310 00:07:55.311360 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.311370 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.311383 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.311392 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.413313 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.413379 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.413397 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.413422 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.413441 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.516645 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.517051 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.517069 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.517095 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.517110 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.566367 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.621093 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.621126 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.621137 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.621155 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.621167 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.723544 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.723592 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.723606 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.723623 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.723634 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.825779 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.825813 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.825823 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.825838 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.825849 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.927940 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.927965 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.927973 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.927986 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.927994 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.030525 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.030562 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.030573 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.030591 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.030602 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.033799 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" event={"ID":"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08","Type":"ContainerStarted","Data":"244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.040423 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.047544 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.058212 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.069069 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.080612 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.089701 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.112075 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332
b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.133529 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.133579 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.133593 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.133613 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.133637 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.139326 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc 
kubenswrapper[4994]: I0310 00:07:56.161961 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.183808 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.201657 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.215484 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.224812 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.234041 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.235671 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.235762 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.235943 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.236104 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.236197 4994 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.245520 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.256231 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.260596 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.260745 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:08:04.260722411 +0000 UTC m=+98.434429170 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.260919 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.261052 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.261158 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.261002 4994 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.261368 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:04.261351953 +0000 UTC m=+98.435058702 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.261192 4994 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.261553 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:04.261544739 +0000 UTC m=+98.435251488 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.261261 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.261748 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.261808 4994 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.261903 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:04.261895281 +0000 UTC m=+98.435602030 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.313634 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.327661 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.338179 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.338204 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.338213 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.338225 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.338234 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.346579 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.356594 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e
18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc 
kubenswrapper[4994]: I0310 00:07:56.362560 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.362600 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.362701 4994 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.362743 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs podName:f4c125b3-4a9c-46a7-a468-54e93c44751d nodeName:}" failed. No retries permitted until 2026-03-10 00:08:04.362731112 +0000 UTC m=+98.536437851 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs") pod "network-metrics-daemon-vxjt2" (UID: "f4c125b3-4a9c-46a7-a468-54e93c44751d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.362798 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.362813 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.362822 4994 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.362845 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:04.362837136 +0000 UTC m=+98.536543875 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.370548 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.383917 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.396363 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.408290 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.419962 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.434085 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.440619 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.440647 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.440656 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.440671 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.440682 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.446823 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.461543 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.477155 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.486848 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.505429 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332
b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.516842 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc 
kubenswrapper[4994]: I0310 00:07:56.529065 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.542565 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.542750 4994 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.543383 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.543438 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.543485 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.553338 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.553444 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.553662 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.553715 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.553663 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.553775 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.553630 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.553827 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.565785 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.577658 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.590964 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 
00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.599691 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc 
kubenswrapper[4994]: I0310 00:07:56.608503 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.619652 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.629462 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.646986 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.647029 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.647059 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.647075 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.647146 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.650062 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.663353 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:
49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.683485 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.695259 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.712865 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.722727 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.734360 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.746181 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc 
kubenswrapper[4994]: I0310 00:07:56.750004 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.750054 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.750066 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.750086 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.750098 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.758525 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.852406 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.852465 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.852484 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.852510 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.852527 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.868704 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.868749 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.868766 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.868789 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.868809 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.888062 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.892497 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.892550 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.892563 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.892582 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.892595 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.909862 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.913652 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.913692 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.913704 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.913720 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.913732 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.934193 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.937817 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.937844 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.937856 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.937892 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.937906 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.948059 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.951606 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.951639 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.951650 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.951665 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.951676 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.961483 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.961643 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.963764 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.963797 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.963810 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.963825 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.963836 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.043750 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.043843 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.043865 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.066288 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.066318 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.066327 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.066341 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.066351 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.070676 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.073815 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.081980 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.092294 4994 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.103907 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.119735 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.128793 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.139331 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.150120 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.159811 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.168542 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.168603 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.168621 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.168645 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.168662 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.172745 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-
binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.191005 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.202718 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.219645 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332
b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.233604 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc 
kubenswrapper[4994]: I0310 00:07:57.252674 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.265802 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.273890 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.273927 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.273936 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.273951 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.273960 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.288940 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.300700 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:4
9Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.318538 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.332815 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.351943 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.370536 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.375773 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.375903 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.375979 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc 
kubenswrapper[4994]: I0310 00:07:57.376062 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.376136 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.383565 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.397200 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.419484 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.436007 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.456636 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.477610 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 
00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.478774 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.478854 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.478887 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.478917 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc 
kubenswrapper[4994]: I0310 00:07:57.478934 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.492264 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc 
kubenswrapper[4994]: I0310 00:07:57.501698 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.515285 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.526690 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.542928 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d3758
9d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.581671 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.581708 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.581721 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.581739 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 
00:07:57.581754 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.684033 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.684073 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.684085 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.684103 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.684114 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.787397 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.787464 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.787483 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.787509 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.787527 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.891556 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.891607 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.891624 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.891650 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.891669 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.995180 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.995422 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.995524 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.995604 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.995683 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.099814 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.099914 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.099934 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.099965 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.099983 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.204474 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.204542 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.204560 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.204584 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.204601 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.307932 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.307975 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.307986 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.308005 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.308017 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.411482 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.411546 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.411563 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.411589 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.411606 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.514779 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.514855 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.514911 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.514938 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.514956 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.553709 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.553749 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.553951 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.553952 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 00:07:58 crc kubenswrapper[4994]: E0310 00:07:58.554373 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d"
Mar 10 00:07:58 crc kubenswrapper[4994]: E0310 00:07:58.554392 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 00:07:58 crc kubenswrapper[4994]: E0310 00:07:58.554483 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 00:07:58 crc kubenswrapper[4994]: E0310 00:07:58.554570 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.618398 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.618484 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.618511 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.618544 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.618570 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.721541 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.721579 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.721589 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.721603 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.721613 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.824421 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.824493 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.824512 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.824537 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.824554 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.928316 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.928413 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.928488 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.928517 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.928536 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.031421 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.031507 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.031534 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.031563 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.031580 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.051093 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/0.log"
Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.055577 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990" exitCode=1
Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.055638 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990"}
Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.056730 4994 scope.go:117] "RemoveContainer" containerID="a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990"
Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.075802 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.104246 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:07:58Z\\\",\\\"message\\\":\\\".io/client-go/informers/factory.go:160\\\\nI0310 00:07:58.268414 6831 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:07:58.268663 6831 reflector.go:311] 
Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:07:58.268788 6831 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:07:58.269025 6831 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:07:58.269124 6831 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:07:58.269588 6831 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:07:58.269631 6831 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:07:58.269709 6831 factory.go:656] Stopping watch factory\\\\nI0310 00:07:58.269736 6831 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:07:58.269795 6831 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 
00:07:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9
eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.124478 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.136081 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.137821 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.137900 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.137914 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.137932 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.137945 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.149864 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.166747 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.181536 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.197267 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.218688 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28
b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.236026 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.240490 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.240522 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.240535 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc 
kubenswrapper[4994]: I0310 00:07:59.240553 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.240566 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.258190 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.272659 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.284969 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.298691 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b672648524
2efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.310454 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.321683 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.343031 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.343092 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.343112 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.343137 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.343155 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.446242 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.446299 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.446316 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.446343 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.446360 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.548921 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.548977 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.548986 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.549002 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.549010 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.578447 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.651373 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.651412 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.651421 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.651435 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.651444 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.753993 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.754029 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.754041 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.754056 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.754065 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.857601 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.857662 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.857681 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.857704 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.857721 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.960831 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.960868 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.960889 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.960903 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.960912 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.061947 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/1.log" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.062695 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/0.log" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.062754 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.063012 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.063132 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.063227 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.063313 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.066301 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c" exitCode=1 Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.066422 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.066490 4994 scope.go:117] "RemoveContainer" containerID="a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.067386 4994 scope.go:117] "RemoveContainer" containerID="eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c" Mar 10 00:08:00 crc kubenswrapper[4994]: E0310 00:08:00.067654 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.080297 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.099925 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:07:58Z\\\",\\\"message\\\":\\\".io/client-go/informers/factory.go:160\\\\nI0310 00:07:58.268414 6831 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:07:58.268663 6831 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:07:58.268788 6831 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:07:58.269025 6831 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:07:58.269124 6831 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:07:58.269588 6831 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:07:58.269631 6831 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:07:58.269709 6831 factory.go:656] Stopping watch factory\\\\nI0310 00:07:58.269736 6831 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:07:58.269795 6831 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:07:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"1.Namespace event handler 5 for removal\\\\nI0310 00:08:00.021673 6969 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:00.021681 6969 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:00.021702 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:00.021724 6969 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 00:08:00.021726 6969 factory.go:656] Stopping watch 
factory\\\\nI0310 00:08:00.021742 6969 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:00.021744 6969 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:00.021750 6969 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:00.021765 6969 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:00.021780 6969 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:00.021822 6969 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:00.022194 6969 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:08:00.022276 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cn
i/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.110553 4994 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.125977 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.140286 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.155386 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.166263 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.166391 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.166638 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.166651 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc 
kubenswrapper[4994]: I0310 00:08:00.166665 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.166684 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.178141 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.206607 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.229141 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.243469 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.263137 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.269444 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc 
kubenswrapper[4994]: I0310 00:08:00.269654 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.269794 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.269966 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.270120 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.281673 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.295031 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.315754 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d3758
9d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.328573 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc 
kubenswrapper[4994]: I0310 00:08:00.343346 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.372320 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.372380 4994 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.372402 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.372428 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.372447 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.475085 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.475125 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.475133 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.475150 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.475159 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.553798 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.553807 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.553832 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:00 crc kubenswrapper[4994]: E0310 00:08:00.553917 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.553842 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:00 crc kubenswrapper[4994]: E0310 00:08:00.554104 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:00 crc kubenswrapper[4994]: E0310 00:08:00.554240 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:00 crc kubenswrapper[4994]: E0310 00:08:00.554316 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.577532 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.577590 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.577610 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.577632 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.577651 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.680612 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.680681 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.680698 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.680728 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.680745 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.784010 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.784076 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.784097 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.784126 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.784149 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.886786 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.886840 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.886857 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.886909 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.886930 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.990277 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.990334 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.990351 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.990376 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.990391 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.072976 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/1.log" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.078078 4994 scope.go:117] "RemoveContainer" containerID="eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c" Mar 10 00:08:01 crc kubenswrapper[4994]: E0310 00:08:01.078251 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.091671 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.093572 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.093711 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.093812 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.093941 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.094027 4994 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.106306 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.118600 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.135934 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.152546 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.165478 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.189527 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00
:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.199519 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.199572 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.199588 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.199610 4994 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.199625 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.208669 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.227590 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.243418 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.261570 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.275823 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.296537 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332
b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.301972 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.302014 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.302029 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.302049 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.302063 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.312631 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc 
kubenswrapper[4994]: I0310 00:08:01.329237 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.349026 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.378761 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"1.Namespace event handler 5 for removal\\\\nI0310 00:08:00.021673 6969 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:00.021681 6969 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:00.021702 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 
00:08:00.021724 6969 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 00:08:00.021726 6969 factory.go:656] Stopping watch factory\\\\nI0310 00:08:00.021742 6969 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:00.021744 6969 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:00.021750 6969 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:00.021765 6969 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:00.021780 6969 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:00.021822 6969 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:00.022194 6969 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:08:00.022276 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.405407 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.405470 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.405487 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.405510 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.405526 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.508349 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.508406 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.508423 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.508446 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.508463 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.612016 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.612060 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.612076 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.612099 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.612116 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.715666 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.716025 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.716213 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.716357 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.716500 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.820926 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.821061 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.821146 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.821240 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.821271 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.924237 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.924364 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.924387 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.924416 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.924437 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.026927 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.027102 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.027122 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.027150 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.027167 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.129508 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.129762 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.129890 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.129992 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.130078 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.232786 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.232911 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.232930 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.232952 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.232969 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.335852 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.335928 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.335945 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.335969 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.335987 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.439437 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.439497 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.439521 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.439547 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.439564 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.542172 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.542521 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.542659 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.542841 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.543023 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.553561 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.553609 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.553649 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.553619 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:02 crc kubenswrapper[4994]: E0310 00:08:02.553762 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:02 crc kubenswrapper[4994]: E0310 00:08:02.553986 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:02 crc kubenswrapper[4994]: E0310 00:08:02.554177 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:02 crc kubenswrapper[4994]: E0310 00:08:02.554228 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.645449 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.645531 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.645555 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.645583 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.645605 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.748414 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.748478 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.748502 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.748534 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.748579 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.851650 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.851747 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.851766 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.851824 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.851842 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.955271 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.955339 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.955355 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.955384 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.955400 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.059279 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.059366 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.059389 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.059418 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.059442 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.162986 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.163035 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.163054 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.163077 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.163095 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.266799 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.266865 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.267103 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.267132 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.267152 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.370257 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.370321 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.370339 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.370370 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.370388 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.473092 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.473185 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.473203 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.473226 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.473245 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.581772 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.581835 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.581935 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.582539 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.582589 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.686097 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.686152 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.686170 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.686193 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.686210 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.788987 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.789041 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.789063 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.789092 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.789114 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.892504 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.892590 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.892673 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.892708 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.892867 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.996026 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.996072 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.996083 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.996098 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.996123 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.101346 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.101386 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.101397 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.101411 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.101420 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.205322 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.205370 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.205386 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.205408 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.205425 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.308776 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.308822 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.308845 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.308909 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.308929 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.345773 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.345930 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.345970 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.346054 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.346152 4994 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not 
registered Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.346217 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:20.346195848 +0000 UTC m=+114.519902637 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.346739 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:08:20.346714425 +0000 UTC m=+114.520421214 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.346901 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.346938 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.346980 4994 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.347027 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:20.347013606 +0000 UTC m=+114.520720395 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.347103 4994 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.347140 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:20.347128899 +0000 UTC m=+114.520835688 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.415630 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.415713 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.415737 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.415765 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.415790 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.446725 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.446829 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.447097 4994 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.447190 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs podName:f4c125b3-4a9c-46a7-a468-54e93c44751d nodeName:}" failed. No retries permitted until 2026-03-10 00:08:20.447160374 +0000 UTC m=+114.620867163 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs") pod "network-metrics-daemon-vxjt2" (UID: "f4c125b3-4a9c-46a7-a468-54e93c44751d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.447752 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.447797 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.447821 4994 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.447926 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:20.447901449 +0000 UTC m=+114.621608238 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.518962 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.519020 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.519040 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.519067 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.519088 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.553282 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.553452 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.553460 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.553543 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.553674 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.553758 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.553940 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.554116 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.622199 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.622239 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.622256 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.622300 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.622317 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.725537 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.725600 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.725623 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.725653 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.725674 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.828642 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.828720 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.828756 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.828786 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.828807 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.932063 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.932126 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.932144 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.932167 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.932184 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.035700 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.035785 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.035807 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.035837 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.035864 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.139418 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.139476 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.139492 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.139515 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.139531 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.243734 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.243803 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.243824 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.243849 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.243868 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.346519 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.346565 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.346578 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.346596 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.346609 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.449521 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.449574 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.449591 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.449620 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.449640 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.551864 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.551943 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.551961 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.551983 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.552000 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.554411 4994 scope.go:117] "RemoveContainer" containerID="4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.655473 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.655538 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.655554 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.655578 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.655596 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.759416 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.759789 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.759810 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.759838 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.759857 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.864330 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.864396 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.864415 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.864450 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.864469 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.966780 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.966811 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.966820 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.966833 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.966841 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.070516 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.071808 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.072032 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.072221 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.072386 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.113243 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.116004 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.117093 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.138693 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f3
6cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.161811 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"1.Namespace event handler 5 for removal\\\\nI0310 00:08:00.021673 6969 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:00.021681 6969 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:00.021702 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 
00:08:00.021724 6969 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 00:08:00.021726 6969 factory.go:656] Stopping watch factory\\\\nI0310 00:08:00.021742 6969 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:00.021744 6969 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:00.021750 6969 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:00.021765 6969 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:00.021780 6969 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:00.021822 6969 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:00.022194 6969 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:08:00.022276 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.175408 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.175466 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.175484 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.175511 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.175528 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.176522 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.189804 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.208469 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.223772 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.239802 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.259576 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.278698 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.278973 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.279067 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc 
kubenswrapper[4994]: I0310 00:08:06.279170 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.279258 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.293902 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.315274 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 
00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.331696 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.349387 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.369711 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.381994 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.382040 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.382049 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.382064 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.382075 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.384118 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.409000 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.432502 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc 
kubenswrapper[4994]: I0310 00:08:06.451189 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.484457 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.484531 4994 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.484555 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.484587 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.484611 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.553329 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.553453 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:06 crc kubenswrapper[4994]: E0310 00:08:06.553564 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.553612 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.553634 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:06 crc kubenswrapper[4994]: E0310 00:08:06.553679 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:06 crc kubenswrapper[4994]: E0310 00:08:06.553723 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:06 crc kubenswrapper[4994]: E0310 00:08:06.554017 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.570564 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.587138 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.587182 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.587195 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.587212 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.587223 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.592117 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.624573 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f429
28e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092
272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.638709 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.655091 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc 
kubenswrapper[4994]: I0310 00:08:06.671416 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.689332 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.689403 4994 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.689427 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.689457 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.689479 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.694535 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.704331 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.722774 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d3758
9d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.745206 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.775193 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"1.Namespace event handler 5 for removal\\\\nI0310 00:08:00.021673 6969 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:00.021681 6969 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:00.021702 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 
00:08:00.021724 6969 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 00:08:00.021726 6969 factory.go:656] Stopping watch factory\\\\nI0310 00:08:00.021742 6969 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:00.021744 6969 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:00.021750 6969 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:00.021765 6969 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:00.021780 6969 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:00.021822 6969 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:00.022194 6969 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:08:00.022276 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.792962 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.793089 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.793142 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.793164 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.793193 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.793215 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.806655 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.824743 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.838934 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.851162 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc 
kubenswrapper[4994]: I0310 00:08:06.868361 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.895622 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.895779 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.896096 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.896299 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.896388 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.971638 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.971693 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.971709 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.971733 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.971751 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: E0310 00:08:06.993610 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.998973 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.999216 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.999235 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.999649 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.999709 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: E0310 00:08:07.022267 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.027163 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.027219 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.027237 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.027262 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.027279 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: E0310 00:08:07.048004 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.052740 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.052793 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.052810 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.052834 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.052852 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: E0310 00:08:07.072762 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.078044 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.078111 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.078131 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.078156 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.078173 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: E0310 00:08:07.098744 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4994]: E0310 00:08:07.099644 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.101862 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.102140 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.102362 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.102555 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.102757 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.206570 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.206647 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.206666 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.206691 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.206709 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.309821 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.309924 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.309945 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.309979 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.309997 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.415023 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.415082 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.415100 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.415123 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.415141 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.517962 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.518418 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.518635 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.518910 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.519069 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.622800 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.622852 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.622869 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.622918 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.622935 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.726648 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.726705 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.726723 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.726750 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.726767 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.830964 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.831017 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.831035 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.831065 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.831083 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.934318 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.934660 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.934806 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.934998 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.935139 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.037934 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.038011 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.038037 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.038065 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.038086 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.140752 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.140805 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.140822 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.140846 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.140864 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.243541 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.243595 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.243612 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.243633 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.243650 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.346777 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.346823 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.346832 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.346849 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.346860 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.449562 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.449600 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.449609 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.449623 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.449633 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.553039 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.553165 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:08 crc kubenswrapper[4994]: E0310 00:08:08.553231 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:08 crc kubenswrapper[4994]: E0310 00:08:08.553390 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.553514 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:08 crc kubenswrapper[4994]: E0310 00:08:08.553603 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.553988 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:08 crc kubenswrapper[4994]: E0310 00:08:08.554186 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.554573 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.554814 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.555084 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.555321 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.555542 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.658477 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.658912 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.659054 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.659191 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.659354 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.762298 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.762690 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.762961 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.763146 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.763284 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.866204 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.866525 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.866653 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.866790 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.866987 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.970566 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.970620 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.970637 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.970660 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.970676 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.074344 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.074405 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.074422 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.074448 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.074465 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.177454 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.177534 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.177554 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.177580 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.177599 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.280763 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.280821 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.280837 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.280862 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.280901 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.383778 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.383835 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.383851 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.383893 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.383910 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.487756 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.488107 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.488279 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.488422 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.488547 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.591325 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.591659 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.591803 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.591982 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.592107 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.695620 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.695685 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.695708 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.695736 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.695760 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.798136 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.798188 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.798206 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.798231 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.798248 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.900858 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.900990 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.901008 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.901034 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.901052 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.004997 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.005063 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.005080 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.005102 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.005124 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.107987 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.108030 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.108042 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.108058 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.108069 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.216218 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.216287 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.216304 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.216327 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.216343 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.320177 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.320238 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.320259 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.320284 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.320301 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.423184 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.423263 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.423288 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.423316 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.423339 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.526293 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.526367 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.526392 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.526422 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.526447 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.553011 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.553065 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.553104 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:10 crc kubenswrapper[4994]: E0310 00:08:10.553200 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.553235 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:10 crc kubenswrapper[4994]: E0310 00:08:10.553391 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:10 crc kubenswrapper[4994]: E0310 00:08:10.553557 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:10 crc kubenswrapper[4994]: E0310 00:08:10.553668 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.629425 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.629475 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.629492 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.629515 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.629533 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.732571 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.732940 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.733110 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.733254 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.733386 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.836934 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.836992 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.837010 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.837035 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.837051 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.940133 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.940214 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.940237 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.940266 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.940291 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.044455 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.044528 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.044552 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.044666 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.044694 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:11Z","lastTransitionTime":"2026-03-10T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.147158 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.147545 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.147706 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.147846 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.148018 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:11Z","lastTransitionTime":"2026-03-10T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.251227 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.251962 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.251995 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.252021 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.252042 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:11Z","lastTransitionTime":"2026-03-10T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.354759 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.354824 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.354843 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.354908 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.354928 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:11Z","lastTransitionTime":"2026-03-10T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.458162 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.458223 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.458239 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.458264 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.458283 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:11Z","lastTransitionTime":"2026-03-10T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.560653 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.560701 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.560712 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.560728 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.560741 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:11Z","lastTransitionTime":"2026-03-10T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.663571 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.663624 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.663642 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.663667 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.663684 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:11Z","lastTransitionTime":"2026-03-10T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.766808 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.766862 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.766912 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.766935 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.766951 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:11Z","lastTransitionTime":"2026-03-10T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.869952 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.870009 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.870027 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.870051 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.870070 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:11Z","lastTransitionTime":"2026-03-10T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.973675 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.973739 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.973756 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.973782 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.973801 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:11Z","lastTransitionTime":"2026-03-10T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.076406 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.076469 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.076486 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.076512 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.076530 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.179402 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.179481 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.179502 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.179526 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.179545 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.282658 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.282733 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.282753 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.282778 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.282799 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.386321 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.386369 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.386380 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.386400 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.386411 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.489028 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.489131 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.489158 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.489263 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.489293 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.553661 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.553699 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.553746 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:12 crc kubenswrapper[4994]: E0310 00:08:12.553826 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:12 crc kubenswrapper[4994]: E0310 00:08:12.553947 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:12 crc kubenswrapper[4994]: E0310 00:08:12.554056 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.554013 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:12 crc kubenswrapper[4994]: E0310 00:08:12.554302 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.592565 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.592626 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.592643 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.592666 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.592685 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.695683 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.695732 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.695744 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.695762 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.695773 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.798753 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.798809 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.798826 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.798851 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.798867 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.902122 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.902166 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.902177 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.902199 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.902211 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.005350 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.005395 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.005405 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.005423 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.005434 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.108235 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.108295 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.108311 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.108342 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.108360 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.211606 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.211684 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.211707 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.211735 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.211755 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.314944 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.315026 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.315046 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.315071 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.315087 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.418313 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.418402 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.418419 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.418441 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.418459 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.521779 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.521850 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.521910 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.522007 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.522032 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.624247 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.624282 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.624290 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.624303 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.624312 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.726771 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.726809 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.726816 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.726831 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.726840 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.829481 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.829539 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.829556 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.829579 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.829596 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.932259 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.932309 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.932328 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.932353 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.932371 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.035793 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.035852 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.035945 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.035972 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.035990 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.138793 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.138861 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.138941 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.138971 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.138990 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.242178 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.242265 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.242282 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.242310 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.242328 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.345171 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.345238 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.345255 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.345277 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.345297 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.448585 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.448647 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.448666 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.448695 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.448714 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.551064 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.551118 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.551135 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.551159 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.551176 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.553526 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.553575 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.553565 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.553682 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:14 crc kubenswrapper[4994]: E0310 00:08:14.553849 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:14 crc kubenswrapper[4994]: E0310 00:08:14.554048 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:14 crc kubenswrapper[4994]: E0310 00:08:14.554183 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:14 crc kubenswrapper[4994]: E0310 00:08:14.554291 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.653472 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.653500 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.653508 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.653521 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.653529 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.756396 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.756453 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.756475 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.756503 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.756526 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.860008 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.860061 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.860077 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.860100 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.860116 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.962525 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.962613 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.962660 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.962685 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.962702 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.065323 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.065375 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.065388 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.065407 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.065425 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.168421 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.168492 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.168517 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.168546 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.168568 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.271630 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.271701 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.271725 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.271756 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.271777 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.374267 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.374360 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.374378 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.374787 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.374839 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.478065 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.478113 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.478133 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.478156 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.478173 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.555298 4994 scope.go:117] "RemoveContainer" containerID="eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.580923 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.581225 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.581243 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.581266 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.581282 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.684172 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.684233 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.684254 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.684278 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.684295 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.786595 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.786642 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.786654 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.786671 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.786682 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.889636 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.889686 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.889701 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.889718 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.889730 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.992568 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.992658 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.992685 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.992716 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.992742 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.094861 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.094939 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.094952 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.094965 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.094974 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:16Z","lastTransitionTime":"2026-03-10T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.158846 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/1.log" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.163141 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd"} Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.163863 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.176776 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa388
11c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.186657 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.197567 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.197623 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.197643 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.197673 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.197696 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:16Z","lastTransitionTime":"2026-03-10T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.204833 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.218984 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc 
kubenswrapper[4994]: I0310 00:08:16.235130 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.250759 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.278411 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"1.Namespace event handler 5 for removal\\\\nI0310 00:08:00.021673 6969 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:00.021681 6969 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:00.021702 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 
00:08:00.021724 6969 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 00:08:00.021726 6969 factory.go:656] Stopping watch factory\\\\nI0310 00:08:00.021742 6969 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:00.021744 6969 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:00.021750 6969 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:00.021765 6969 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:00.021780 6969 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:00.021822 6969 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:00.022194 6969 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:08:00.022276 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.292446 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.301049 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.301118 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.301140 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:16 crc 
kubenswrapper[4994]: I0310 00:08:16.301168 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.301190 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:16Z","lastTransitionTime":"2026-03-10T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.304756 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.317508 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.330250 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.341700 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.357498 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.386063 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.402538 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.404380 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.404419 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.404429 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.404444 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.404453 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:16Z","lastTransitionTime":"2026-03-10T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.421410 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.439335 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.506630 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:16 crc 
kubenswrapper[4994]: I0310 00:08:16.506705 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.506731 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.506758 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.506808 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:16Z","lastTransitionTime":"2026-03-10T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.553752 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.553968 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.554007 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:16 crc kubenswrapper[4994]: E0310 00:08:16.553959 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:16 crc kubenswrapper[4994]: E0310 00:08:16.554236 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.554291 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:16 crc kubenswrapper[4994]: E0310 00:08:16.554431 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:16 crc kubenswrapper[4994]: E0310 00:08:16.554550 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.572546 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.596024 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"1.Namespace event handler 5 for removal\\\\nI0310 00:08:00.021673 6969 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:00.021681 6969 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:00.021702 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 
00:08:00.021724 6969 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 00:08:00.021726 6969 factory.go:656] Stopping watch factory\\\\nI0310 00:08:00.021742 6969 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:00.021744 6969 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:00.021750 6969 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:00.021765 6969 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:00.021780 6969 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:00.021822 6969 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:00.022194 6969 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:08:00.022276 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.609530 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.609564 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.609572 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.609584 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.609592 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:16Z","lastTransitionTime":"2026-03-10T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.613674 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.624713 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.637518 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.659114 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.677947 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.696914 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.714397 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.714441 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.714450 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.714465 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.714476 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:16Z","lastTransitionTime":"2026-03-10T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.719539 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.740999 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.758580 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.776781 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.794264 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.810048 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.816928 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.817017 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.817043 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.817075 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.817104 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:16Z","lastTransitionTime":"2026-03-10T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.834802 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.845910 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc 
kubenswrapper[4994]: I0310 00:08:16.859667 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.919893 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.919942 4994 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.919953 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.919968 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.919980 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:16Z","lastTransitionTime":"2026-03-10T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.022747 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.022866 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.022939 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.022969 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.022991 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.126212 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.126261 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.126273 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.126290 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.126301 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.169121 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/2.log" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.169762 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/1.log" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.175484 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd" exitCode=1 Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.175544 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.175593 4994 scope.go:117] "RemoveContainer" containerID="eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.176183 4994 scope.go:117] "RemoveContainer" containerID="e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd" Mar 10 00:08:17 crc kubenswrapper[4994]: E0310 00:08:17.176324 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.198437 4994 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.229059 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"1.Namespace event handler 5 for removal\\\\nI0310 00:08:00.021673 6969 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:00.021681 6969 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:00.021702 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 
00:08:00.021724 6969 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 00:08:00.021726 6969 factory.go:656] Stopping watch factory\\\\nI0310 00:08:00.021742 6969 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:00.021744 6969 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:00.021750 6969 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:00.021765 6969 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:00.021780 6969 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:00.021822 6969 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:00.022194 6969 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:08:00.022276 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"nding *v1.Node event handler 7 for removal\\\\nI0310 00:08:16.449147 7176 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:16.449144 7176 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:16.449167 7176 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:16.449186 7176 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:16.449189 7176 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:16.449246 7176 handler.go:190] Sending *v1.EgressIP event handler 
8 for removal\\\\nI0310 00:08:16.449272 7176 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:16.449290 7176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:16.449419 7176 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449463 7176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:16.449489 7176 factory.go:656] Stopping watch factory\\\\nI0310 00:08:16.449504 7176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449517 7176 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 00:08:16.449561 7176 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:16.449591 7176 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:16.449695 7176 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.229447 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.229480 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.229491 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.229508 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.229521 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.247542 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.262151 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.268869 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.268990 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.269016 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.269047 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.269071 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.281081 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: E0310 00:08:17.289676 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.297274 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.297335 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.297354 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.297383 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.297402 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.297917 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: E0310 00:08:17.314010 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.317706 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.319714 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.319789 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.319813 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.320074 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.320125 4994 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.335971 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: E0310 00:08:17.341640 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.346568 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.346660 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.346678 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.346702 
4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.346720 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.356144 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: E0310 00:08:17.371259 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.376690 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.376745 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.376764 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.376787 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.376806 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.381094 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z 
is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: E0310 00:08:17.399165 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: E0310 00:08:17.399502 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.401761 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.401807 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.401824 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.401844 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.401860 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.413218 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.432935 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.445734 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc 
kubenswrapper[4994]: I0310 00:08:17.461476 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.476469 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.490689 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.505143 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.505189 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.505209 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.505232 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.505249 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.510514 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.608605 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.608680 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.608701 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.608734 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.608756 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.711853 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.711942 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.711959 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.711983 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.712002 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.815185 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.815246 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.815264 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.815291 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.815308 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.918056 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.918096 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.918105 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.918120 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.918131 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.021738 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.021791 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.021808 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.021831 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.021849 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:18Z","lastTransitionTime":"2026-03-10T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.125094 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.125167 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.125187 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.125212 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.125233 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:18Z","lastTransitionTime":"2026-03-10T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.181506 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/2.log" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.187372 4994 scope.go:117] "RemoveContainer" containerID="e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd" Mar 10 00:08:18 crc kubenswrapper[4994]: E0310 00:08:18.187631 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.204820 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.228395 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.229103 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.229172 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:18 crc 
kubenswrapper[4994]: I0310 00:08:18.229196 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.229228 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.229251 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:18Z","lastTransitionTime":"2026-03-10T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.247553 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.267927 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.284758 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.300567 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.332263 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.332320 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.332342 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.332372 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.332390 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:18Z","lastTransitionTime":"2026-03-10T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.335714 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.358843 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.377746 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.392376 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.411851 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.422554 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.435110 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.435188 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.435210 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.435238 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.435260 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:18Z","lastTransitionTime":"2026-03-10T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.445513 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.460143 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc 
kubenswrapper[4994]: I0310 00:08:18.477713 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.498005 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.529657 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"nding *v1.Node event handler 7 for removal\\\\nI0310 00:08:16.449147 7176 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:16.449144 7176 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:16.449167 7176 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:16.449186 7176 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:16.449189 7176 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:16.449246 7176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:16.449272 7176 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:16.449290 7176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:16.449419 7176 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449463 7176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:16.449489 7176 factory.go:656] Stopping watch factory\\\\nI0310 00:08:16.449504 7176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449517 7176 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 00:08:16.449561 7176 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:16.449591 7176 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:16.449695 7176 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.537685 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.537759 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.537782 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.537807 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.537824 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:18Z","lastTransitionTime":"2026-03-10T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.553669 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.553698 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:18 crc kubenswrapper[4994]: E0310 00:08:18.553853 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.553897 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.553985 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:18 crc kubenswrapper[4994]: E0310 00:08:18.554143 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:18 crc kubenswrapper[4994]: E0310 00:08:18.554245 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:18 crc kubenswrapper[4994]: E0310 00:08:18.554424 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.639998 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.640040 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.640052 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.640068 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.640082 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:18Z","lastTransitionTime":"2026-03-10T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.742715 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.742766 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.742782 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.742804 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.742821 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:18Z","lastTransitionTime":"2026-03-10T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.846119 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.846173 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.846190 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.846213 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.846229 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:18Z","lastTransitionTime":"2026-03-10T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.949184 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.949251 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.949268 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.949293 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.949310 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:18Z","lastTransitionTime":"2026-03-10T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.052503 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.052573 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.052595 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.052621 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.052640 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:19Z","lastTransitionTime":"2026-03-10T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.155041 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.155100 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.155119 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.155147 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.155165 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:19Z","lastTransitionTime":"2026-03-10T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.257848 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.257938 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.257958 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.257984 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.258002 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:19Z","lastTransitionTime":"2026-03-10T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.360901 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.360951 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.360969 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.360996 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.361014 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:19Z","lastTransitionTime":"2026-03-10T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.464310 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.464370 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.464388 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.464416 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.464433 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:19Z","lastTransitionTime":"2026-03-10T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.567945 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.568004 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.568026 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.568055 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.568077 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:19Z","lastTransitionTime":"2026-03-10T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.670974 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.671044 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.671066 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.671095 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.671116 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:19Z","lastTransitionTime":"2026-03-10T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.773864 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.773981 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.774001 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.774028 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.774047 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:19Z","lastTransitionTime":"2026-03-10T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.877304 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.877446 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.877468 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.877501 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.877519 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:19Z","lastTransitionTime":"2026-03-10T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.980557 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.980632 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.980651 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.980674 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.980691 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:19Z","lastTransitionTime":"2026-03-10T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.084059 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.084111 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.084128 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.084150 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.084167 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:20Z","lastTransitionTime":"2026-03-10T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.187247 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.187623 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.187779 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.188012 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.188224 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:20Z","lastTransitionTime":"2026-03-10T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.291657 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.291718 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.291740 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.291772 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.291793 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:20Z","lastTransitionTime":"2026-03-10T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.367993 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.368187 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:08:52.368149771 +0000 UTC m=+146.541856580 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.368291 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.368373 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.368427 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.368483 4994 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.368583 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:52.368556815 +0000 UTC m=+146.542263604 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.368590 4994 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.368661 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:52.368639688 +0000 UTC m=+146.542346467 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.368662 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.368737 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.368758 4994 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.368862 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:52.368828265 +0000 UTC m=+146.542535044 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.395248 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.395303 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.395325 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.395356 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.395385 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:20Z","lastTransitionTime":"2026-03-10T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.470322 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.470553 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.470627 4994 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.470776 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs podName:f4c125b3-4a9c-46a7-a468-54e93c44751d nodeName:}" failed. No retries permitted until 2026-03-10 00:08:52.470738973 +0000 UTC m=+146.644445912 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs") pod "network-metrics-daemon-vxjt2" (UID: "f4c125b3-4a9c-46a7-a468-54e93c44751d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.470846 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.470928 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.470958 4994 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.471098 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:52.471058504 +0000 UTC m=+146.644765453 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.498923 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.498997 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.499018 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.499052 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.499082 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:20Z","lastTransitionTime":"2026-03-10T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.553067 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.553242 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.553416 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.553415 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.553517 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.553666 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.553905 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.554041 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.601683 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.601733 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.601755 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.601784 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.601807 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:20Z","lastTransitionTime":"2026-03-10T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.706349 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.706423 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.706448 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.706481 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.706509 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:20Z","lastTransitionTime":"2026-03-10T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.809830 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.809937 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.809962 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.809991 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.810015 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:20Z","lastTransitionTime":"2026-03-10T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.912281 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.912355 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.912382 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.912413 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.912432 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:20Z","lastTransitionTime":"2026-03-10T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.017531 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.017597 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.017615 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.017641 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.017658 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:21Z","lastTransitionTime":"2026-03-10T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.120319 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.120382 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.120400 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.120426 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.120448 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:21Z","lastTransitionTime":"2026-03-10T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.223623 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.223712 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.223735 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.223759 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.223819 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:21Z","lastTransitionTime":"2026-03-10T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.332502 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.332582 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.332603 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.332696 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.332761 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:21Z","lastTransitionTime":"2026-03-10T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.436902 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.436999 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.437019 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.437046 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.437067 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:21Z","lastTransitionTime":"2026-03-10T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.541216 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.541292 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.541312 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.541337 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.541356 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:21Z","lastTransitionTime":"2026-03-10T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.644576 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.644636 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.644653 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.644678 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.644695 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:21Z","lastTransitionTime":"2026-03-10T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.749623 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.749682 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.749700 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.749725 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.749743 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:21Z","lastTransitionTime":"2026-03-10T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.852659 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.852734 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.852760 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.852832 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.852861 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:21Z","lastTransitionTime":"2026-03-10T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.957052 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.957117 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.957135 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.957160 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.957180 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:21Z","lastTransitionTime":"2026-03-10T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.059941 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.060003 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.060021 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.060053 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.060071 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:22Z","lastTransitionTime":"2026-03-10T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.162700 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.162782 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.162802 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.162826 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.162843 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:22Z","lastTransitionTime":"2026-03-10T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.265988 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.266025 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.266056 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.266071 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.266080 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:22Z","lastTransitionTime":"2026-03-10T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.369408 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.369486 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.369512 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.369545 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.369567 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:22Z","lastTransitionTime":"2026-03-10T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.473422 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.473523 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.473546 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.473569 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.473587 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:22Z","lastTransitionTime":"2026-03-10T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.554052 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:22 crc kubenswrapper[4994]: E0310 00:08:22.554285 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.554984 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:22 crc kubenswrapper[4994]: E0310 00:08:22.555085 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.555164 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:22 crc kubenswrapper[4994]: E0310 00:08:22.555258 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.555426 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:22 crc kubenswrapper[4994]: E0310 00:08:22.555517 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.577371 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.577431 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.577451 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.577473 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.577493 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:22Z","lastTransitionTime":"2026-03-10T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.680865 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.680965 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.680982 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.681008 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.681027 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:22Z","lastTransitionTime":"2026-03-10T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.785173 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.785242 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.785263 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.785288 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.785305 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:22Z","lastTransitionTime":"2026-03-10T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.889963 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.890030 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.890050 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.890075 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.890094 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:22Z","lastTransitionTime":"2026-03-10T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.994089 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.994147 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.994164 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.994187 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.994205 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:22Z","lastTransitionTime":"2026-03-10T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.097866 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.097957 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.097975 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.098002 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.098021 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:23Z","lastTransitionTime":"2026-03-10T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.202330 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.202393 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.202413 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.202441 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.202458 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:23Z","lastTransitionTime":"2026-03-10T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.305787 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.305835 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.305854 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.305910 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.305928 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:23Z","lastTransitionTime":"2026-03-10T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.409497 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.409547 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.409563 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.409586 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.409603 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:23Z","lastTransitionTime":"2026-03-10T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.513649 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.513705 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.513722 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.513753 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.513770 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:23Z","lastTransitionTime":"2026-03-10T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.617288 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.617357 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.617376 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.617401 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.617420 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:23Z","lastTransitionTime":"2026-03-10T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.720757 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.720841 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.720862 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.720923 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.720947 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:23Z","lastTransitionTime":"2026-03-10T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.824559 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.824629 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.824650 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.824676 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.824692 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:23Z","lastTransitionTime":"2026-03-10T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.843844 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.864507 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 10 
00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.886201 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.902130 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.925501 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332
b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:23Z is after 2025-08-24T17:21:41Z"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.927570 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.927662 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.927695 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.927731 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.927757 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:23Z","lastTransitionTime":"2026-03-10T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.941354 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:23 crc 
kubenswrapper[4994]: I0310 00:08:23.959570 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.985255 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"nding *v1.Node event handler 7 for removal\\\\nI0310 00:08:16.449147 7176 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:16.449144 7176 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:16.449167 7176 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:16.449186 7176 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:16.449189 7176 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:16.449246 7176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:16.449272 7176 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:16.449290 7176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:16.449419 7176 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449463 7176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:16.449489 7176 factory.go:656] Stopping watch factory\\\\nI0310 00:08:16.449504 7176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449517 7176 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 00:08:16.449561 7176 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:16.449591 7176 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:16.449695 7176 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.005058 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.017830 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.027543 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.030987 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.031152 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.031171 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.031198 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.031240 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:24Z","lastTransitionTime":"2026-03-10T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.041749 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.060497 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.078228 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.097918 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.123144 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.134387 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.134421 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.134432 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.134448 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.134459 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:24Z","lastTransitionTime":"2026-03-10T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.145050 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver
-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.163460 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.237081 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.237205 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.237226 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.237250 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.237268 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:24Z","lastTransitionTime":"2026-03-10T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.340679 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.340738 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.340756 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.340783 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.340800 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:24Z","lastTransitionTime":"2026-03-10T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.443354 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.443410 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.443425 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.443445 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.443460 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:24Z","lastTransitionTime":"2026-03-10T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.546763 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.546847 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.546860 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.546910 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.546925 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:24Z","lastTransitionTime":"2026-03-10T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.553334 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.553373 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.553507 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.553597 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2"
Mar 10 00:08:24 crc kubenswrapper[4994]: E0310 00:08:24.553516 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 00:08:24 crc kubenswrapper[4994]: E0310 00:08:24.553756 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 00:08:24 crc kubenswrapper[4994]: E0310 00:08:24.553826 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d"
Mar 10 00:08:24 crc kubenswrapper[4994]: E0310 00:08:24.554007 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.651129 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.651205 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.651225 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.651251 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.651269 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:24Z","lastTransitionTime":"2026-03-10T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.754611 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.754664 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.754684 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.754711 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.754729 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:24Z","lastTransitionTime":"2026-03-10T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.857181 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.857602 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.857620 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.857645 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.857665 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:24Z","lastTransitionTime":"2026-03-10T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.960281 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.960334 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.960350 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.960374 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.960390 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:24Z","lastTransitionTime":"2026-03-10T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.063781 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.063859 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.063913 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.063939 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.063957 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:25Z","lastTransitionTime":"2026-03-10T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.166551 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.166619 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.166638 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.166659 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.166676 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:25Z","lastTransitionTime":"2026-03-10T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.269531 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.269573 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.269585 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.269599 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.269608 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:25Z","lastTransitionTime":"2026-03-10T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.373221 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.373561 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.373704 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.373858 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.374057 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:25Z","lastTransitionTime":"2026-03-10T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.476899 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.477003 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.477023 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.477047 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.477064 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:25Z","lastTransitionTime":"2026-03-10T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.580018 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.580096 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.580114 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.580141 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.580159 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:25Z","lastTransitionTime":"2026-03-10T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.682707 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.682838 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.682859 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.682912 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.682934 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:25Z","lastTransitionTime":"2026-03-10T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.789047 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.789224 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.789244 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.789317 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.789338 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:25Z","lastTransitionTime":"2026-03-10T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.893074 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.893133 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.893149 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.893173 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.893190 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:25Z","lastTransitionTime":"2026-03-10T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.996542 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.996647 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.996670 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.996738 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.996796 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:25Z","lastTransitionTime":"2026-03-10T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.101084 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.101171 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.101193 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.101225 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.101255 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:26Z","lastTransitionTime":"2026-03-10T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.204137 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.204201 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.204218 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.204243 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.204261 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:26Z","lastTransitionTime":"2026-03-10T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.307151 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.307428 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.307601 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.307751 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.307867 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:26Z","lastTransitionTime":"2026-03-10T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.410952 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.411030 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.411049 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.411073 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.411094 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:26Z","lastTransitionTime":"2026-03-10T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:26 crc kubenswrapper[4994]: E0310 00:08:26.516299 4994 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.553489 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.553866 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 00:08:26 crc kubenswrapper[4994]: E0310 00:08:26.554574 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.554024 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2"
Mar 10 00:08:26 crc kubenswrapper[4994]: E0310 00:08:26.554643 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.553965 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 00:08:26 crc kubenswrapper[4994]: E0310 00:08:26.555335 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 00:08:26 crc kubenswrapper[4994]: E0310 00:08:26.555065 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.577960 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z"
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.611605 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"nding *v1.Node event handler 7 for removal\\\\nI0310 00:08:16.449147 7176 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:16.449144 7176 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:16.449167 7176 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:16.449186 7176 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:16.449189 7176 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:16.449246 7176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:16.449272 7176 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:16.449290 7176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:16.449419 7176 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449463 7176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:16.449489 7176 factory.go:656] Stopping watch factory\\\\nI0310 00:08:16.449504 7176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449517 7176 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 00:08:16.449561 7176 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:16.449591 7176 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:16.449695 7176 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.631867 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.654109 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: E0310 00:08:26.656439 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.675454 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.691680 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.709604 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc 
kubenswrapper[4994]: I0310 00:08:26.731244 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.752538 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.774095 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.807758 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.830048 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.851041 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc 
kubenswrapper[4994]: I0310 00:08:26.870198 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.891407 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.907698 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.933903 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d3758
9d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.641517 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.641579 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.641600 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.641628 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 
00:08:27.641648 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:27Z","lastTransitionTime":"2026-03-10T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:27 crc kubenswrapper[4994]: E0310 00:08:27.663640 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.669353 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.669410 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.669428 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.669452 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.669512 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:27Z","lastTransitionTime":"2026-03-10T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:27 crc kubenswrapper[4994]: E0310 00:08:27.690079 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.695421 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.695486 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.695506 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.695532 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.695549 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:27Z","lastTransitionTime":"2026-03-10T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:27 crc kubenswrapper[4994]: E0310 00:08:27.714767 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.719939 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.720059 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.720089 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.720119 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.720141 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:27Z","lastTransitionTime":"2026-03-10T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:27 crc kubenswrapper[4994]: E0310 00:08:27.740244 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.745349 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.745421 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.745445 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.745473 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.745495 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:27Z","lastTransitionTime":"2026-03-10T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:27 crc kubenswrapper[4994]: E0310 00:08:27.765769 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4994]: E0310 00:08:27.766017 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:08:28 crc kubenswrapper[4994]: I0310 00:08:28.553309 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:28 crc kubenswrapper[4994]: E0310 00:08:28.553477 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:28 crc kubenswrapper[4994]: I0310 00:08:28.553746 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:28 crc kubenswrapper[4994]: E0310 00:08:28.553847 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:28 crc kubenswrapper[4994]: I0310 00:08:28.554262 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:28 crc kubenswrapper[4994]: I0310 00:08:28.554396 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:28 crc kubenswrapper[4994]: E0310 00:08:28.554588 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:28 crc kubenswrapper[4994]: E0310 00:08:28.554750 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:30 crc kubenswrapper[4994]: I0310 00:08:30.553435 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:30 crc kubenswrapper[4994]: E0310 00:08:30.553624 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:30 crc kubenswrapper[4994]: I0310 00:08:30.553993 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:30 crc kubenswrapper[4994]: E0310 00:08:30.554112 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:30 crc kubenswrapper[4994]: I0310 00:08:30.554209 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:30 crc kubenswrapper[4994]: I0310 00:08:30.554225 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:30 crc kubenswrapper[4994]: E0310 00:08:30.554389 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:30 crc kubenswrapper[4994]: E0310 00:08:30.554537 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:31 crc kubenswrapper[4994]: E0310 00:08:31.658251 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:08:32 crc kubenswrapper[4994]: I0310 00:08:32.553510 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:32 crc kubenswrapper[4994]: I0310 00:08:32.553625 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:32 crc kubenswrapper[4994]: I0310 00:08:32.553525 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:32 crc kubenswrapper[4994]: I0310 00:08:32.553560 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:32 crc kubenswrapper[4994]: E0310 00:08:32.553740 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:32 crc kubenswrapper[4994]: E0310 00:08:32.553843 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:32 crc kubenswrapper[4994]: E0310 00:08:32.554107 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:32 crc kubenswrapper[4994]: E0310 00:08:32.554187 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:33 crc kubenswrapper[4994]: I0310 00:08:33.554466 4994 scope.go:117] "RemoveContainer" containerID="e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd" Mar 10 00:08:33 crc kubenswrapper[4994]: E0310 00:08:33.554770 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" Mar 10 00:08:34 crc kubenswrapper[4994]: I0310 00:08:34.553847 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:34 crc kubenswrapper[4994]: I0310 00:08:34.553941 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:34 crc kubenswrapper[4994]: I0310 00:08:34.553949 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:34 crc kubenswrapper[4994]: I0310 00:08:34.553861 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:34 crc kubenswrapper[4994]: E0310 00:08:34.554039 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:34 crc kubenswrapper[4994]: E0310 00:08:34.554202 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:34 crc kubenswrapper[4994]: E0310 00:08:34.554319 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:34 crc kubenswrapper[4994]: E0310 00:08:34.554443 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.254594 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mcxcb_6dac87a5-07eb-488d-85fe-cb8848434ae5/kube-multus/0.log" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.255711 4994 generic.go:334] "Generic (PLEG): container finished" podID="6dac87a5-07eb-488d-85fe-cb8848434ae5" containerID="5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c" exitCode=1 Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.255842 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mcxcb" event={"ID":"6dac87a5-07eb-488d-85fe-cb8848434ae5","Type":"ContainerDied","Data":"5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c"} Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.256592 4994 scope.go:117] "RemoveContainer" containerID="5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.281393 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d
3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.300225 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.317555 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"2026-03-10T00:07:50+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5\\\\n2026-03-10T00:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5 to /host/opt/cni/bin/\\\\n2026-03-10T00:07:51Z [verbose] multus-daemon started\\\\n2026-03-10T00:07:51Z [verbose] Readiness Indicator file check\\\\n2026-03-10T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.349824 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.372105 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f50
38f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.387815 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc 
kubenswrapper[4994]: I0310 00:08:36.404686 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.420909 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.435423 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.452895 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.482430 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"nding *v1.Node event handler 7 for removal\\\\nI0310 00:08:16.449147 7176 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:16.449144 7176 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:16.449167 7176 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:16.449186 7176 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:16.449189 7176 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:16.449246 7176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:16.449272 7176 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:16.449290 7176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:16.449419 7176 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449463 7176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:16.449489 7176 factory.go:656] Stopping watch factory\\\\nI0310 00:08:16.449504 7176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449517 7176 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 00:08:16.449561 7176 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:16.449591 7176 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:16.449695 7176 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.503413 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.521611 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.541640 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.553904 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.553961 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.554040 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.554183 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:36 crc kubenswrapper[4994]: E0310 00:08:36.554272 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:36 crc kubenswrapper[4994]: E0310 00:08:36.554379 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:36 crc kubenswrapper[4994]: E0310 00:08:36.554514 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:36 crc kubenswrapper[4994]: E0310 00:08:36.554634 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.559736 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mo
untPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.572907 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.585429 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc 
kubenswrapper[4994]: I0310 00:08:36.602716 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.617897 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.634955 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.650779 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: E0310 00:08:36.658865 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.666610 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.681123 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.717258 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7
285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.738051 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d
3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.755713 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.772702 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"2026-03-10T00:07:50+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5\\\\n2026-03-10T00:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5 to /host/opt/cni/bin/\\\\n2026-03-10T00:07:51Z [verbose] multus-daemon started\\\\n2026-03-10T00:07:51Z [verbose] Readiness Indicator file check\\\\n2026-03-10T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.790334 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.816865 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b672648524
2efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.834641 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.847895 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.862816 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.890479 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"nding *v1.Node event handler 7 for removal\\\\nI0310 00:08:16.449147 7176 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:16.449144 7176 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:16.449167 7176 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:16.449186 7176 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:16.449189 7176 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:16.449246 7176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:16.449272 7176 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:16.449290 7176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:16.449419 7176 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449463 7176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:16.449489 7176 factory.go:656] Stopping watch factory\\\\nI0310 00:08:16.449504 7176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449517 7176 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 00:08:16.449561 7176 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:16.449591 7176 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:16.449695 7176 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.909007 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.263029 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mcxcb_6dac87a5-07eb-488d-85fe-cb8848434ae5/kube-multus/0.log" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.263127 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mcxcb" event={"ID":"6dac87a5-07eb-488d-85fe-cb8848434ae5","Type":"ContainerStarted","Data":"04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106"} Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.300515 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"nding *v1.Node event handler 7 for removal\\\\nI0310 00:08:16.449147 7176 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:16.449144 7176 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:16.449167 7176 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:16.449186 7176 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:16.449189 7176 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:16.449246 7176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:16.449272 7176 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:16.449290 7176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:16.449419 7176 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449463 7176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:16.449489 7176 factory.go:656] Stopping watch factory\\\\nI0310 00:08:16.449504 7176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449517 7176 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 00:08:16.449561 7176 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:16.449591 7176 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:16.449695 7176 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.322433 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.343967 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.362791 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.382517 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.402589 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.418488 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.436499 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc 
kubenswrapper[4994]: I0310 00:08:37.472486 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.493912 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.513830 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.534864 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"2026-03-10T00:07:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5\\\\n2026-03-10T00:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5 to /host/opt/cni/bin/\\\\n2026-03-10T00:07:51Z [verbose] multus-daemon started\\\\n2026-03-10T00:07:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.551340 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc5
4ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.574118 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332
b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.590412 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc 
kubenswrapper[4994]: I0310 00:08:37.607781 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.627700 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.796386 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.796439 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.796457 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.796483 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.796505 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:37Z","lastTransitionTime":"2026-03-10T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:37 crc kubenswrapper[4994]: E0310 00:08:37.819672 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.824899 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.824954 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.824975 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.825002 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.825023 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:37Z","lastTransitionTime":"2026-03-10T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:37 crc kubenswrapper[4994]: E0310 00:08:37.846464 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.850777 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.850841 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.850862 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.850918 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.850937 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:37Z","lastTransitionTime":"2026-03-10T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:37 crc kubenswrapper[4994]: E0310 00:08:37.868622 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.874529 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.874604 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.874626 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.874660 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.874682 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:37Z","lastTransitionTime":"2026-03-10T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:37 crc kubenswrapper[4994]: E0310 00:08:37.894518 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.899425 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.899482 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.899502 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.899531 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.899548 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:37Z","lastTransitionTime":"2026-03-10T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:37 crc kubenswrapper[4994]: E0310 00:08:37.919124 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: E0310 00:08:37.919352 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:08:38 crc kubenswrapper[4994]: I0310 00:08:38.553341 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:38 crc kubenswrapper[4994]: I0310 00:08:38.553452 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:38 crc kubenswrapper[4994]: I0310 00:08:38.553360 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:38 crc kubenswrapper[4994]: E0310 00:08:38.553565 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:38 crc kubenswrapper[4994]: I0310 00:08:38.553606 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:38 crc kubenswrapper[4994]: E0310 00:08:38.553951 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:38 crc kubenswrapper[4994]: E0310 00:08:38.554029 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:38 crc kubenswrapper[4994]: E0310 00:08:38.554073 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:40 crc kubenswrapper[4994]: I0310 00:08:40.553138 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:40 crc kubenswrapper[4994]: I0310 00:08:40.553198 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:40 crc kubenswrapper[4994]: I0310 00:08:40.553138 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:40 crc kubenswrapper[4994]: I0310 00:08:40.553297 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:40 crc kubenswrapper[4994]: E0310 00:08:40.553493 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:40 crc kubenswrapper[4994]: E0310 00:08:40.553923 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:40 crc kubenswrapper[4994]: E0310 00:08:40.554180 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:40 crc kubenswrapper[4994]: E0310 00:08:40.554347 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:40 crc kubenswrapper[4994]: I0310 00:08:40.569143 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 10 00:08:41 crc kubenswrapper[4994]: E0310 00:08:41.660945 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:08:42 crc kubenswrapper[4994]: I0310 00:08:42.553513 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:42 crc kubenswrapper[4994]: I0310 00:08:42.553644 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:42 crc kubenswrapper[4994]: E0310 00:08:42.553714 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:42 crc kubenswrapper[4994]: I0310 00:08:42.553738 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:42 crc kubenswrapper[4994]: E0310 00:08:42.553911 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:42 crc kubenswrapper[4994]: I0310 00:08:42.553952 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:42 crc kubenswrapper[4994]: E0310 00:08:42.554031 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:42 crc kubenswrapper[4994]: E0310 00:08:42.554169 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:44 crc kubenswrapper[4994]: I0310 00:08:44.553367 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:44 crc kubenswrapper[4994]: I0310 00:08:44.553397 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:44 crc kubenswrapper[4994]: I0310 00:08:44.553489 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:44 crc kubenswrapper[4994]: E0310 00:08:44.553549 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:44 crc kubenswrapper[4994]: E0310 00:08:44.553714 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:44 crc kubenswrapper[4994]: I0310 00:08:44.553755 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:44 crc kubenswrapper[4994]: E0310 00:08:44.553955 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:44 crc kubenswrapper[4994]: E0310 00:08:44.554396 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:44 crc kubenswrapper[4994]: I0310 00:08:44.568449 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.553131 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.553231 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:46 crc kubenswrapper[4994]: E0310 00:08:46.553306 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.553320 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.553338 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:46 crc kubenswrapper[4994]: E0310 00:08:46.553440 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:46 crc kubenswrapper[4994]: E0310 00:08:46.554253 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:46 crc kubenswrapper[4994]: E0310 00:08:46.554376 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.554534 4994 scope.go:117] "RemoveContainer" containerID="e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.589469 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"nding *v1.Node event handler 7 for removal\\\\nI0310 00:08:16.449147 7176 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:16.449144 7176 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:16.449167 7176 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:16.449186 7176 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:16.449189 7176 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:16.449246 7176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:16.449272 7176 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:16.449290 7176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:16.449419 7176 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449463 7176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:16.449489 7176 factory.go:656] Stopping watch factory\\\\nI0310 00:08:16.449504 7176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449517 7176 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 00:08:16.449561 7176 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:16.449591 7176 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:16.449695 7176 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.609617 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.630794 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45a7a43f-ba45-4f54-92a9-6fae6144cd7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558ec9f0b766ac2f80c3d721a99be111e4f7af9e09bd7880b4197525bc7d7406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 00:06:28.788269 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:28.790599 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:28.823559 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:28.826417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:58.399257 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:58.399347 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://597b2c3115f29618ea3e6b294f965e82d2c20cb7d2696bbaf686f37aacb920d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d813390aa8385ab9838a01b3f678e39dde836a7084291095d4582ca467b83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f52835a4ae8c3bb9dfd82d6d82a25994f8f5182fe9de997d39c5dd4561260f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.650041 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: E0310 00:08:46.663597 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.674293 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.693550 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.711282 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.724805 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.736746 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc 
kubenswrapper[4994]: I0310 00:08:46.768984 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.787965 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.800847 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.815912 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"2026-03-10T00:07:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5\\\\n2026-03-10T00:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5 to /host/opt/cni/bin/\\\\n2026-03-10T00:07:51Z [verbose] multus-daemon started\\\\n2026-03-10T00:07:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.833498 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d23bec6-0598-4970-8f25-e80867c2ad16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045232a99c43b7a39857ce2e6902de25546425a59bb0f9c7b076e6f0d9629f56\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0b61110f8d95cb59bdf543779a8e2787b6b9b8f10548fa643d5bfb41f24d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7cdb503feaf0be642fb32d4cac4a4ab28552f16b83033f22c9eacd90a623ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.847920 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.867685 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b672648524
2efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.880157 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.896526 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.914970 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.302009 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/2.log" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.305099 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e"} Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.305575 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.319662 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.343326 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc 
kubenswrapper[4994]: I0310 00:08:47.364122 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45a7a43f-ba45-4f54-92a9-6fae6144cd7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558ec9f0b766ac2f80c3d721a99be111e4f7af9e09bd7880b4197525bc7d7406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 00:06:28.788269 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:28.790599 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:28.823559 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:28.826417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:58.399257 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:58.399347 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://597b2c3115f29618ea3e6b294f965e82d2c20cb7d2696bbaf686f37aacb920d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d813390aa8385ab9838a01b3f678e39dde836a7084291095d4582ca467b83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f52835a4ae8c3bb9dfd82d6d82a25994f8f5182fe9de997d39c5dd4561260f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.378206 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.392897 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.406853 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.418329 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.429265 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d23bec6-0598-4970-8f25-e80867c2ad16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045232a99c43b7a39857ce2e6902de25546425a59bb0f9c7b076e6f0d9629f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0b61110f8d95cb59bdf543779a8e2787b6b9b8f10548fa643d5bfb41f24d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7cdb503feaf0be642fb32d4cac4a4ab28552f16b83033f22c9eacd90a623ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.446117 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.467645 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d
3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.481733 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.498699 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"2026-03-10T00:07:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5\\\\n2026-03-10T00:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5 to /host/opt/cni/bin/\\\\n2026-03-10T00:07:51Z [verbose] multus-daemon started\\\\n2026-03-10T00:07:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.509198 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.520479 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.533998 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d3758
9d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.544437 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc 
kubenswrapper[4994]: I0310 00:08:47.555904 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.569191 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.588834 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"nding *v1.Node event handler 7 for removal\\\\nI0310 00:08:16.449147 7176 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:16.449144 7176 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:16.449167 7176 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:16.449186 7176 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:16.449189 7176 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:16.449246 7176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:16.449272 7176 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:16.449290 7176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:16.449419 7176 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449463 7176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:16.449489 7176 factory.go:656] Stopping watch factory\\\\nI0310 00:08:16.449504 7176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449517 7176 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 00:08:16.449561 7176 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:16.449591 7176 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:16.449695 7176 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.180418 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.180463 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.180473 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.180489 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.180502 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:48Z","lastTransitionTime":"2026-03-10T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.199040 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.203921 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.203969 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.203984 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.204002 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.204013 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:48Z","lastTransitionTime":"2026-03-10T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.217893 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.221884 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.221917 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.221925 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.221939 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.221948 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:48Z","lastTransitionTime":"2026-03-10T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.239732 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.244218 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.244261 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.244273 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.244289 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.244302 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:48Z","lastTransitionTime":"2026-03-10T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.261417 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.264822 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.264850 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.264859 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.264893 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.264903 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:48Z","lastTransitionTime":"2026-03-10T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.277280 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.277395 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.310334 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/3.log" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.311100 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/2.log" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.313927 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" 
containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" exitCode=1 Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.313973 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e"} Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.314016 4994 scope.go:117] "RemoveContainer" containerID="e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.315327 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.315553 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.334010 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.350527 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.368514 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d3758
9d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.382412 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc 
kubenswrapper[4994]: I0310 00:08:48.398629 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.418208 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.446341 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"nding *v1.Node event handler 7 for removal\\\\nI0310 00:08:16.449147 7176 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:16.449144 7176 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:16.449167 7176 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:16.449186 7176 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:16.449189 7176 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:16.449246 7176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:16.449272 7176 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:16.449290 7176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:16.449419 7176 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449463 7176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:16.449489 7176 factory.go:656] Stopping watch factory\\\\nI0310 00:08:16.449504 7176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449517 7176 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 00:08:16.449561 7176 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:16.449591 7176 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:16.449695 7176 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.219\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0310 00:08:47.724956 7508 services_controller.go:445] Built service openshift-dns/dns-default LB template configs for network=default: []services.lbConfig(nil)\\\\nI0310 00:08:47.724964 7508 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0310 00:08:47.724977 7508 services_controller.go:453] Built service openshift-kube-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nF0310 00:08:47.724982 7508 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9e
ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.459764 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.479163 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45a7a43f-ba45-4f54-92a9-6fae6144cd7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558ec9f0b766ac2f80c3d721a99be111e4f7af9e09bd7880b4197525bc7d7406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 00:06:28.788269 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:28.790599 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:28.823559 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:28.826417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:58.399257 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:58.399347 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://597b2c3115f29618ea3e6b294f965e82d2c20cb7d2696bbaf686f37aacb920d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d813390aa8385ab9838a01b3f678e39dde836a7084291095d4582ca467b83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f52835a4ae8c3bb9dfd82d6d82a25994f8f5182fe9de997d39c5dd4561260f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.503989 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.523292 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.540818 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.554128 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.554233 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.554269 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.554463 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.554550 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.554619 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.554725 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.554801 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.559234 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.575280 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.590353 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d23bec6-0598-4970-8f25-e80867c2ad16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045232a99c43b7a39857ce2e6902de25546425a59bb0f9c7b076e6f0d9629f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0b61110f8d95cb59bdf543779a8e2787b6b9b8f10548fa643d5bfb41f24d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7cdb503feaf0be642fb32d4cac4a4ab28552f16b83033f22c9eacd90a623ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.614262 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.635616 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.652350 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.670490 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"2026-03-10T00:07:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5\\\\n2026-03-10T00:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5 to /host/opt/cni/bin/\\\\n2026-03-10T00:07:51Z [verbose] multus-daemon started\\\\n2026-03-10T00:07:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.321622 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/3.log" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.327568 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:08:49 crc kubenswrapper[4994]: E0310 00:08:49.327949 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.347212 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.366739 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"2026-03-10T00:07:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5\\\\n2026-03-10T00:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5 to /host/opt/cni/bin/\\\\n2026-03-10T00:07:51Z [verbose] multus-daemon started\\\\n2026-03-10T00:07:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.385364 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d23bec6-0598-4970-8f25-e80867c2ad16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045232a99c43b7a39857ce2e6902de25546425a59bb0f9c7b076e6f0d9629f56\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0b61110f8d95cb59bdf543779a8e2787b6b9b8f10548fa643d5bfb41f24d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7cdb503feaf0be642fb32d4cac4a4ab28552f16b83033f22c9eacd90a623ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.416220 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.438900 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.455211 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc 
kubenswrapper[4994]: I0310 00:08:49.472259 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.492746 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.507473 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.530134 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d3758
9d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.550077 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.580624 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.219\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0310 00:08:47.724956 7508 services_controller.go:445] Built service openshift-dns/dns-default LB template configs for network=default: []services.lbConfig(nil)\\\\nI0310 00:08:47.724964 7508 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0310 00:08:47.724977 7508 services_controller.go:453] Built service openshift-kube-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nF0310 00:08:47.724982 7508 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.598817 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.618148 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.633931 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.646464 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.662670 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc 
kubenswrapper[4994]: I0310 00:08:49.683246 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45a7a43f-ba45-4f54-92a9-6fae6144cd7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558ec9f0b766ac2f80c3d721a99be111e4f7af9e09bd7880b4197525bc7d7406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 00:06:28.788269 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:28.790599 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:28.823559 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:28.826417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:58.399257 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:58.399347 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://597b2c3115f29618ea3e6b294f965e82d2c20cb7d2696bbaf686f37aacb920d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d813390aa8385ab9838a01b3f678e39dde836a7084291095d4582ca467b83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f52835a4ae8c3bb9dfd82d6d82a25994f8f5182fe9de997d39c5dd4561260f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.733241 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:50 crc kubenswrapper[4994]: I0310 00:08:50.553228 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:50 crc kubenswrapper[4994]: I0310 00:08:50.553332 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:50 crc kubenswrapper[4994]: E0310 00:08:50.553433 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:50 crc kubenswrapper[4994]: I0310 00:08:50.553454 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:50 crc kubenswrapper[4994]: E0310 00:08:50.553622 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:50 crc kubenswrapper[4994]: I0310 00:08:50.553751 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:50 crc kubenswrapper[4994]: E0310 00:08:50.554030 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:50 crc kubenswrapper[4994]: E0310 00:08:50.554120 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:51 crc kubenswrapper[4994]: E0310 00:08:51.664634 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 10 00:08:52 crc kubenswrapper[4994]: I0310 00:08:52.414493 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:08:52 crc kubenswrapper[4994]: I0310 00:08:52.414744 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.414832 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.414789359 +0000 UTC m=+210.588496148 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.414869 4994 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:08:52 crc kubenswrapper[4994]: I0310 00:08:52.414966 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.415011 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.414983403 +0000 UTC m=+210.588690182 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:08:52 crc kubenswrapper[4994]: I0310 00:08:52.415049 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.415144 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.415200 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.415224 4994 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.415227 4994 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.415298 4994 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.41528213 +0000 UTC m=+210.588988909 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.415359 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.41531892 +0000 UTC m=+210.589025679 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:52 crc kubenswrapper[4994]: I0310 00:08:52.516689 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:52 crc kubenswrapper[4994]: I0310 00:08:52.516943 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.517035 4994 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.517217 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs podName:f4c125b3-4a9c-46a7-a468-54e93c44751d nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.517177258 +0000 UTC m=+210.690884187 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs") pod "network-metrics-daemon-vxjt2" (UID: "f4c125b3-4a9c-46a7-a468-54e93c44751d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.517246 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.517304 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.517326 4994 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.517440 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.517406253 +0000 UTC m=+210.691113152 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:52 crc kubenswrapper[4994]: I0310 00:08:52.553597 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:52 crc kubenswrapper[4994]: I0310 00:08:52.553643 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:52 crc kubenswrapper[4994]: I0310 00:08:52.553659 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:52 crc kubenswrapper[4994]: I0310 00:08:52.553660 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.553935 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.554083 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.554297 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.554395 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:54 crc kubenswrapper[4994]: I0310 00:08:54.553513 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:54 crc kubenswrapper[4994]: E0310 00:08:54.554168 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:54 crc kubenswrapper[4994]: I0310 00:08:54.553692 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:54 crc kubenswrapper[4994]: E0310 00:08:54.554288 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:54 crc kubenswrapper[4994]: I0310 00:08:54.553706 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:54 crc kubenswrapper[4994]: E0310 00:08:54.554406 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:54 crc kubenswrapper[4994]: I0310 00:08:54.553611 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:54 crc kubenswrapper[4994]: E0310 00:08:54.554494 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.553341 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.553448 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:56 crc kubenswrapper[4994]: E0310 00:08:56.553572 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.553380 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:56 crc kubenswrapper[4994]: E0310 00:08:56.553743 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.553852 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:56 crc kubenswrapper[4994]: E0310 00:08:56.553996 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:56 crc kubenswrapper[4994]: E0310 00:08:56.554088 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.575493 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.598627 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.618670 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.635907 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.652029 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: E0310 00:08:56.665279 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.667589 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.690992 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45a7a43f-ba45-4f54-92a9-6fae6144cd7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558ec9f0b766ac2f80c3d721a99be111e4f7af9e09bd7880b4197525bc7d7406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 00:06:28.788269 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:28.790599 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:28.823559 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:28.826417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:58.399257 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:58.399347 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://597b2c3115f29618ea3e6b294f965e82d2c20cb7d2696bbaf686f37aacb920d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d813390aa8385ab9838a01b3f678e39dde836a7084291095d4582ca467b83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f52835a4ae8c3bb9dfd82d6d82a25994f8f5182fe9de997d39c5dd4561260f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.712148 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d
3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.731560 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.770720 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"2026-03-10T00:07:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5\\\\n2026-03-10T00:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5 to /host/opt/cni/bin/\\\\n2026-03-10T00:07:51Z [verbose] multus-daemon started\\\\n2026-03-10T00:07:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.794689 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d23bec6-0598-4970-8f25-e80867c2ad16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045232a99c43b7a39857ce2e6902de25546425a59bb0f9c7b076e6f0d9629f56\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0b61110f8d95cb59bdf543779a8e2787b6b9b8f10548fa643d5bfb41f24d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7cdb503feaf0be642fb32d4cac4a4ab28552f16b83033f22c9eacd90a623ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.828259 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.851758 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f50
38f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.871275 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc 
kubenswrapper[4994]: I0310 00:08:56.888176 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.907626 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.920314 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.939024 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.970905 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.219\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0310 00:08:47.724956 7508 services_controller.go:445] Built service openshift-dns/dns-default LB template configs for network=default: []services.lbConfig(nil)\\\\nI0310 00:08:47.724964 7508 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0310 00:08:47.724977 7508 services_controller.go:453] Built service openshift-kube-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nF0310 00:08:47.724982 7508 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.546278 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.546333 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.546378 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.546398 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.546409 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:58Z","lastTransitionTime":"2026-03-10T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.554352 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.554358 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.554510 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:58 crc kubenswrapper[4994]: E0310 00:08:58.554700 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.554977 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:58 crc kubenswrapper[4994]: E0310 00:08:58.555059 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:58 crc kubenswrapper[4994]: E0310 00:08:58.555232 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:58 crc kubenswrapper[4994]: E0310 00:08:58.556700 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:58 crc kubenswrapper[4994]: E0310 00:08:58.567312 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.573712 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.573767 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.573788 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.573817 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.573836 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:58Z","lastTransitionTime":"2026-03-10T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:58 crc kubenswrapper[4994]: E0310 00:08:58.595186 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.600366 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.600417 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.600428 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.600454 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.600470 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:58Z","lastTransitionTime":"2026-03-10T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:58 crc kubenswrapper[4994]: E0310 00:08:58.620632 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.625752 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.625808 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.625826 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.625856 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.625905 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:58Z","lastTransitionTime":"2026-03-10T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:58 crc kubenswrapper[4994]: E0310 00:08:58.647827 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.653705 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.653762 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.653780 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.653804 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.653821 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:58Z","lastTransitionTime":"2026-03-10T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:58 crc kubenswrapper[4994]: E0310 00:08:58.674838 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:58 crc kubenswrapper[4994]: E0310 00:08:58.675122 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:09:00 crc kubenswrapper[4994]: I0310 00:09:00.553828 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:00 crc kubenswrapper[4994]: I0310 00:09:00.553926 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:00 crc kubenswrapper[4994]: I0310 00:09:00.553850 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:00 crc kubenswrapper[4994]: I0310 00:09:00.554003 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:00 crc kubenswrapper[4994]: E0310 00:09:00.554054 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:00 crc kubenswrapper[4994]: E0310 00:09:00.554282 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:00 crc kubenswrapper[4994]: E0310 00:09:00.554509 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:00 crc kubenswrapper[4994]: E0310 00:09:00.554579 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:01 crc kubenswrapper[4994]: E0310 00:09:01.666550 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:09:02 crc kubenswrapper[4994]: I0310 00:09:02.553335 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:02 crc kubenswrapper[4994]: I0310 00:09:02.553460 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:02 crc kubenswrapper[4994]: E0310 00:09:02.553748 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:02 crc kubenswrapper[4994]: I0310 00:09:02.553799 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:02 crc kubenswrapper[4994]: I0310 00:09:02.553833 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:02 crc kubenswrapper[4994]: E0310 00:09:02.554058 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:02 crc kubenswrapper[4994]: E0310 00:09:02.554497 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:02 crc kubenswrapper[4994]: E0310 00:09:02.554746 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:02 crc kubenswrapper[4994]: I0310 00:09:02.555925 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:09:02 crc kubenswrapper[4994]: E0310 00:09:02.556176 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" Mar 10 00:09:04 crc kubenswrapper[4994]: I0310 00:09:04.553584 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:04 crc kubenswrapper[4994]: I0310 00:09:04.553693 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:04 crc kubenswrapper[4994]: E0310 00:09:04.553779 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:04 crc kubenswrapper[4994]: E0310 00:09:04.553908 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:04 crc kubenswrapper[4994]: I0310 00:09:04.554029 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:04 crc kubenswrapper[4994]: E0310 00:09:04.554172 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:04 crc kubenswrapper[4994]: I0310 00:09:04.554419 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:04 crc kubenswrapper[4994]: E0310 00:09:04.554588 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.553423 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.553610 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.553855 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:06 crc kubenswrapper[4994]: E0310 00:09:06.553845 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.553932 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:06 crc kubenswrapper[4994]: E0310 00:09:06.554065 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:06 crc kubenswrapper[4994]: E0310 00:09:06.554485 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:06 crc kubenswrapper[4994]: E0310 00:09:06.554626 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.574683 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.606181 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.219\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0310 00:08:47.724956 7508 services_controller.go:445] Built service openshift-dns/dns-default LB template configs for network=default: []services.lbConfig(nil)\\\\nI0310 00:08:47.724964 7508 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0310 00:08:47.724977 7508 services_controller.go:453] Built service openshift-kube-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nF0310 00:08:47.724982 7508 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.627253 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.645557 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.664745 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: E0310 00:09:06.672477 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.683077 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.699547 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.714625 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc 
kubenswrapper[4994]: I0310 00:09:06.735665 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45a7a43f-ba45-4f54-92a9-6fae6144cd7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558ec9f0b766ac2f80c3d721a99be111e4f7af9e09bd7880b4197525bc7d7406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 00:06:28.788269 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:28.790599 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:28.823559 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:28.826417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:58.399257 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:58.399347 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://597b2c3115f29618ea3e6b294f965e82d2c20cb7d2696bbaf686f37aacb920d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d813390aa8385ab9838a01b3f678e39dde836a7084291095d4582ca467b83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f52835a4ae8c3bb9dfd82d6d82a25994f8f5182fe9de997d39c5dd4561260f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.757341 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d
3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.777528 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.799331 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"2026-03-10T00:07:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5\\\\n2026-03-10T00:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5 to /host/opt/cni/bin/\\\\n2026-03-10T00:07:51Z [verbose] multus-daemon started\\\\n2026-03-10T00:07:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.816533 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d23bec6-0598-4970-8f25-e80867c2ad16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045232a99c43b7a39857ce2e6902de25546425a59bb0f9c7b076e6f0d9629f56\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0b61110f8d95cb59bdf543779a8e2787b6b9b8f10548fa643d5bfb41f24d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7cdb503feaf0be642fb32d4cac4a4ab28552f16b83033f22c9eacd90a623ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.848212 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.868631 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f50
38f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.883497 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc 
kubenswrapper[4994]: I0310 00:09:06.899158 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.915920 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.930821 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.553761 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.553867 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.553943 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.553944 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:08 crc kubenswrapper[4994]: E0310 00:09:08.555481 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:08 crc kubenswrapper[4994]: E0310 00:09:08.555651 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:08 crc kubenswrapper[4994]: E0310 00:09:08.556012 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:08 crc kubenswrapper[4994]: E0310 00:09:08.556116 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.680012 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.680101 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.680123 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.680150 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.680169 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:09:08Z","lastTransitionTime":"2026-03-10T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:09:08 crc kubenswrapper[4994]: E0310 00:09:08.702705 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.709365 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.709460 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.709489 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.709520 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.709545 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:09:08Z","lastTransitionTime":"2026-03-10T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:09:08 crc kubenswrapper[4994]: E0310 00:09:08.727918 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.734171 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.734336 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.734460 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.734573 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.734668 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:09:08Z","lastTransitionTime":"2026-03-10T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:09:08 crc kubenswrapper[4994]: E0310 00:09:08.756479 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.761430 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.761496 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.761515 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.761542 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.761562 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:09:08Z","lastTransitionTime":"2026-03-10T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:09:08 crc kubenswrapper[4994]: E0310 00:09:08.782752 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.788208 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.788264 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.788283 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.788312 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.788333 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:09:08Z","lastTransitionTime":"2026-03-10T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:09:08 crc kubenswrapper[4994]: E0310 00:09:08.808355 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:08 crc kubenswrapper[4994]: E0310 00:09:08.808582 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:09:10 crc kubenswrapper[4994]: I0310 00:09:10.553310 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:10 crc kubenswrapper[4994]: I0310 00:09:10.553396 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:10 crc kubenswrapper[4994]: I0310 00:09:10.553471 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:10 crc kubenswrapper[4994]: E0310 00:09:10.553679 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:10 crc kubenswrapper[4994]: I0310 00:09:10.553749 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:10 crc kubenswrapper[4994]: E0310 00:09:10.553853 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:10 crc kubenswrapper[4994]: E0310 00:09:10.554122 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:10 crc kubenswrapper[4994]: E0310 00:09:10.554168 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:11 crc kubenswrapper[4994]: E0310 00:09:11.674629 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:09:12 crc kubenswrapper[4994]: I0310 00:09:12.553434 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:12 crc kubenswrapper[4994]: I0310 00:09:12.553523 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:12 crc kubenswrapper[4994]: E0310 00:09:12.553618 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:12 crc kubenswrapper[4994]: I0310 00:09:12.553643 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:12 crc kubenswrapper[4994]: E0310 00:09:12.553788 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:12 crc kubenswrapper[4994]: I0310 00:09:12.553857 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:12 crc kubenswrapper[4994]: E0310 00:09:12.553983 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:12 crc kubenswrapper[4994]: E0310 00:09:12.554076 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:13 crc kubenswrapper[4994]: I0310 00:09:13.554490 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:09:13 crc kubenswrapper[4994]: E0310 00:09:13.554762 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" Mar 10 00:09:14 crc kubenswrapper[4994]: I0310 00:09:14.553088 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:14 crc kubenswrapper[4994]: I0310 00:09:14.553091 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:14 crc kubenswrapper[4994]: I0310 00:09:14.553224 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:14 crc kubenswrapper[4994]: I0310 00:09:14.553511 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:14 crc kubenswrapper[4994]: E0310 00:09:14.553688 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:14 crc kubenswrapper[4994]: E0310 00:09:14.554234 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:14 crc kubenswrapper[4994]: E0310 00:09:14.554534 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:14 crc kubenswrapper[4994]: E0310 00:09:14.554568 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.554168 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.554276 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.554739 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.554852 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:16 crc kubenswrapper[4994]: E0310 00:09:16.555059 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:16 crc kubenswrapper[4994]: E0310 00:09:16.555171 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:16 crc kubenswrapper[4994]: E0310 00:09:16.555249 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:16 crc kubenswrapper[4994]: E0310 00:09:16.555512 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.604433 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" podStartSLOduration=124.604399043 podStartE2EDuration="2m4.604399043s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.604046756 +0000 UTC m=+170.777753545" watchObservedRunningTime="2026-03-10 00:09:16.604399043 +0000 UTC m=+170.778105832" Mar 10 00:09:16 crc kubenswrapper[4994]: E0310 00:09:16.676029 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.681543 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-24l69" podStartSLOduration=125.681516187 podStartE2EDuration="2m5.681516187s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.648909754 +0000 UTC m=+170.822616543" watchObservedRunningTime="2026-03-10 00:09:16.681516187 +0000 UTC m=+170.855222976" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.709817 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" podStartSLOduration=125.709774476 podStartE2EDuration="2m5.709774476s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.682399576 +0000 UTC m=+170.856106365" watchObservedRunningTime="2026-03-10 00:09:16.709774476 +0000 UTC m=+170.883481275" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.836154 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jhp6z" podStartSLOduration=125.836118702 podStartE2EDuration="2m5.836118702s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.835917738 +0000 UTC m=+171.009624547" watchObservedRunningTime="2026-03-10 00:09:16.836118702 +0000 UTC m=+171.009825491" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.836783 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podStartSLOduration=125.836772677 podStartE2EDuration="2m5.836772677s" 
podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.819566386 +0000 UTC m=+170.993273175" watchObservedRunningTime="2026-03-10 00:09:16.836772677 +0000 UTC m=+171.010479466" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.850504 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=81.85047612299999 podStartE2EDuration="1m21.850476123s" podCreationTimestamp="2026-03-10 00:07:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.850307979 +0000 UTC m=+171.024014768" watchObservedRunningTime="2026-03-10 00:09:16.850476123 +0000 UTC m=+171.024182912" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.874984 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=32.87495716 podStartE2EDuration="32.87495716s" podCreationTimestamp="2026-03-10 00:08:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.874587993 +0000 UTC m=+171.048294742" watchObservedRunningTime="2026-03-10 00:09:16.87495716 +0000 UTC m=+171.048663939" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.920059 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mcxcb" podStartSLOduration=125.920035483 podStartE2EDuration="2m5.920035483s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.919807908 +0000 UTC m=+171.093514667" 
watchObservedRunningTime="2026-03-10 00:09:16.920035483 +0000 UTC m=+171.093742232" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.933108 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=36.933074955 podStartE2EDuration="36.933074955s" podCreationTimestamp="2026-03-10 00:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.932584184 +0000 UTC m=+171.106290943" watchObservedRunningTime="2026-03-10 00:09:16.933074955 +0000 UTC m=+171.106781714" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.957949 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=77.957921591 podStartE2EDuration="1m17.957921591s" podCreationTimestamp="2026-03-10 00:07:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.956164103 +0000 UTC m=+171.129870862" watchObservedRunningTime="2026-03-10 00:09:16.957921591 +0000 UTC m=+171.131628350" Mar 10 00:09:18 crc kubenswrapper[4994]: I0310 00:09:18.553505 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:18 crc kubenswrapper[4994]: I0310 00:09:18.553593 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:18 crc kubenswrapper[4994]: I0310 00:09:18.553596 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:18 crc kubenswrapper[4994]: E0310 00:09:18.553682 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:18 crc kubenswrapper[4994]: I0310 00:09:18.553719 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:18 crc kubenswrapper[4994]: E0310 00:09:18.554078 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:18 crc kubenswrapper[4994]: E0310 00:09:18.554113 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:18 crc kubenswrapper[4994]: E0310 00:09:18.554278 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:18 crc kubenswrapper[4994]: I0310 00:09:18.993374 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:09:18 crc kubenswrapper[4994]: I0310 00:09:18.993443 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:09:18 crc kubenswrapper[4994]: I0310 00:09:18.993463 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:09:18 crc kubenswrapper[4994]: I0310 00:09:18.993493 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:09:18 crc kubenswrapper[4994]: I0310 00:09:18.993522 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:09:18Z","lastTransitionTime":"2026-03-10T00:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.064053 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=88.064024256 podStartE2EDuration="1m28.064024256s" podCreationTimestamp="2026-03-10 00:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.972208969 +0000 UTC m=+171.145915728" watchObservedRunningTime="2026-03-10 00:09:19.064024256 +0000 UTC m=+173.237731045" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.067222 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c"] Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.068063 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.069906 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.070966 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.071859 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.072938 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.150744 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/8f78cac8-8497-4562-a457-3650bda3763b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.150855 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f78cac8-8497-4562-a457-3650bda3763b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.150942 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f78cac8-8497-4562-a457-3650bda3763b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.151011 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8f78cac8-8497-4562-a457-3650bda3763b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.151071 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f78cac8-8497-4562-a457-3650bda3763b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.251617 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f78cac8-8497-4562-a457-3650bda3763b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.251931 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8f78cac8-8497-4562-a457-3650bda3763b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.251983 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f78cac8-8497-4562-a457-3650bda3763b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.252015 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f78cac8-8497-4562-a457-3650bda3763b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.252077 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/8f78cac8-8497-4562-a457-3650bda3763b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.252174 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8f78cac8-8497-4562-a457-3650bda3763b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.252389 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8f78cac8-8497-4562-a457-3650bda3763b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.252796 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f78cac8-8497-4562-a457-3650bda3763b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.261690 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f78cac8-8497-4562-a457-3650bda3763b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 
00:09:19.281270 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f78cac8-8497-4562-a457-3650bda3763b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.393035 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: W0310 00:09:19.411923 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f78cac8_8497_4562_a457_3650bda3763b.slice/crio-fcb58fc6c527be484e4db272d9983b77c5ceadd4b8ec45272617a375f879b216 WatchSource:0}: Error finding container fcb58fc6c527be484e4db272d9983b77c5ceadd4b8ec45272617a375f879b216: Status 404 returned error can't find the container with id fcb58fc6c527be484e4db272d9983b77c5ceadd4b8ec45272617a375f879b216 Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.466398 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" event={"ID":"8f78cac8-8497-4562-a457-3650bda3763b","Type":"ContainerStarted","Data":"fcb58fc6c527be484e4db272d9983b77c5ceadd4b8ec45272617a375f879b216"} Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.585801 4994 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.597539 4994 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 00:09:20 crc kubenswrapper[4994]: I0310 00:09:20.473206 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" event={"ID":"8f78cac8-8497-4562-a457-3650bda3763b","Type":"ContainerStarted","Data":"be745be1c0f181666631c98c9f488824f4fe59f9cfa59e72ee16f5e487265cca"} Mar 10 00:09:20 crc kubenswrapper[4994]: I0310 00:09:20.497305 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" podStartSLOduration=129.497280991 podStartE2EDuration="2m9.497280991s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:20.497111518 +0000 UTC m=+174.670818297" watchObservedRunningTime="2026-03-10 00:09:20.497280991 +0000 UTC m=+174.670987760" Mar 10 00:09:20 crc kubenswrapper[4994]: I0310 00:09:20.553958 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:20 crc kubenswrapper[4994]: I0310 00:09:20.554044 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:20 crc kubenswrapper[4994]: E0310 00:09:20.554144 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:20 crc kubenswrapper[4994]: I0310 00:09:20.553985 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:20 crc kubenswrapper[4994]: E0310 00:09:20.554300 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:20 crc kubenswrapper[4994]: E0310 00:09:20.554577 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:20 crc kubenswrapper[4994]: I0310 00:09:20.554793 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:20 crc kubenswrapper[4994]: E0310 00:09:20.554967 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:21 crc kubenswrapper[4994]: E0310 00:09:21.677670 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:09:22 crc kubenswrapper[4994]: I0310 00:09:22.483904 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mcxcb_6dac87a5-07eb-488d-85fe-cb8848434ae5/kube-multus/1.log" Mar 10 00:09:22 crc kubenswrapper[4994]: I0310 00:09:22.484617 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mcxcb_6dac87a5-07eb-488d-85fe-cb8848434ae5/kube-multus/0.log" Mar 10 00:09:22 crc kubenswrapper[4994]: I0310 00:09:22.484690 4994 generic.go:334] "Generic (PLEG): container finished" podID="6dac87a5-07eb-488d-85fe-cb8848434ae5" containerID="04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106" exitCode=1 Mar 10 00:09:22 crc kubenswrapper[4994]: I0310 00:09:22.484733 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mcxcb" event={"ID":"6dac87a5-07eb-488d-85fe-cb8848434ae5","Type":"ContainerDied","Data":"04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106"} Mar 10 00:09:22 crc kubenswrapper[4994]: I0310 00:09:22.484781 4994 scope.go:117] "RemoveContainer" containerID="5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c" Mar 10 00:09:22 crc kubenswrapper[4994]: I0310 00:09:22.485419 4994 scope.go:117] "RemoveContainer" containerID="04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106" Mar 10 00:09:22 crc kubenswrapper[4994]: E0310 00:09:22.485690 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-multus pod=multus-mcxcb_openshift-multus(6dac87a5-07eb-488d-85fe-cb8848434ae5)\"" pod="openshift-multus/multus-mcxcb" podUID="6dac87a5-07eb-488d-85fe-cb8848434ae5" Mar 10 00:09:22 crc kubenswrapper[4994]: I0310 00:09:22.579532 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:22 crc kubenswrapper[4994]: I0310 00:09:22.579605 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:22 crc kubenswrapper[4994]: I0310 00:09:22.579680 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:22 crc kubenswrapper[4994]: E0310 00:09:22.579697 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:22 crc kubenswrapper[4994]: I0310 00:09:22.579778 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:22 crc kubenswrapper[4994]: E0310 00:09:22.580107 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:22 crc kubenswrapper[4994]: E0310 00:09:22.580242 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:22 crc kubenswrapper[4994]: E0310 00:09:22.580337 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:23 crc kubenswrapper[4994]: I0310 00:09:23.492498 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mcxcb_6dac87a5-07eb-488d-85fe-cb8848434ae5/kube-multus/1.log" Mar 10 00:09:24 crc kubenswrapper[4994]: I0310 00:09:24.553730 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:24 crc kubenswrapper[4994]: I0310 00:09:24.553733 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:24 crc kubenswrapper[4994]: I0310 00:09:24.553806 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:24 crc kubenswrapper[4994]: I0310 00:09:24.554317 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:24 crc kubenswrapper[4994]: E0310 00:09:24.554506 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:24 crc kubenswrapper[4994]: E0310 00:09:24.554623 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:24 crc kubenswrapper[4994]: E0310 00:09:24.554750 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:24 crc kubenswrapper[4994]: I0310 00:09:24.555019 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:09:24 crc kubenswrapper[4994]: E0310 00:09:24.555030 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:24 crc kubenswrapper[4994]: E0310 00:09:24.555304 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" Mar 10 00:09:26 crc kubenswrapper[4994]: I0310 00:09:26.553076 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:26 crc kubenswrapper[4994]: I0310 00:09:26.553133 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:26 crc kubenswrapper[4994]: E0310 00:09:26.554501 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:26 crc kubenswrapper[4994]: I0310 00:09:26.554539 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:26 crc kubenswrapper[4994]: I0310 00:09:26.554618 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:26 crc kubenswrapper[4994]: E0310 00:09:26.554756 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:26 crc kubenswrapper[4994]: E0310 00:09:26.554937 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:26 crc kubenswrapper[4994]: E0310 00:09:26.555043 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:26 crc kubenswrapper[4994]: E0310 00:09:26.678522 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:09:28 crc kubenswrapper[4994]: I0310 00:09:28.553427 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:28 crc kubenswrapper[4994]: I0310 00:09:28.553481 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:28 crc kubenswrapper[4994]: I0310 00:09:28.553549 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:28 crc kubenswrapper[4994]: E0310 00:09:28.553659 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:28 crc kubenswrapper[4994]: I0310 00:09:28.553710 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:28 crc kubenswrapper[4994]: E0310 00:09:28.553854 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:28 crc kubenswrapper[4994]: E0310 00:09:28.554019 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:28 crc kubenswrapper[4994]: E0310 00:09:28.554196 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:30 crc kubenswrapper[4994]: I0310 00:09:30.553772 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:30 crc kubenswrapper[4994]: I0310 00:09:30.553835 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:30 crc kubenswrapper[4994]: E0310 00:09:30.553981 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:30 crc kubenswrapper[4994]: I0310 00:09:30.554040 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:30 crc kubenswrapper[4994]: I0310 00:09:30.554083 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:30 crc kubenswrapper[4994]: E0310 00:09:30.554210 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:30 crc kubenswrapper[4994]: E0310 00:09:30.554333 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:30 crc kubenswrapper[4994]: E0310 00:09:30.554424 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:31 crc kubenswrapper[4994]: E0310 00:09:31.680128 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:09:32 crc kubenswrapper[4994]: I0310 00:09:32.553547 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:32 crc kubenswrapper[4994]: E0310 00:09:32.553777 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:32 crc kubenswrapper[4994]: I0310 00:09:32.554142 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:32 crc kubenswrapper[4994]: I0310 00:09:32.554189 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:32 crc kubenswrapper[4994]: E0310 00:09:32.554315 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:32 crc kubenswrapper[4994]: I0310 00:09:32.554346 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:32 crc kubenswrapper[4994]: E0310 00:09:32.554467 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:32 crc kubenswrapper[4994]: E0310 00:09:32.554601 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:34 crc kubenswrapper[4994]: I0310 00:09:34.553136 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:34 crc kubenswrapper[4994]: I0310 00:09:34.553198 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:34 crc kubenswrapper[4994]: I0310 00:09:34.553301 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:34 crc kubenswrapper[4994]: E0310 00:09:34.553458 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:34 crc kubenswrapper[4994]: I0310 00:09:34.553492 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:34 crc kubenswrapper[4994]: E0310 00:09:34.553686 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:34 crc kubenswrapper[4994]: E0310 00:09:34.553722 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:34 crc kubenswrapper[4994]: E0310 00:09:34.553807 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:36 crc kubenswrapper[4994]: I0310 00:09:36.553330 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:36 crc kubenswrapper[4994]: I0310 00:09:36.553443 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:36 crc kubenswrapper[4994]: E0310 00:09:36.555359 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:36 crc kubenswrapper[4994]: I0310 00:09:36.555394 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:36 crc kubenswrapper[4994]: E0310 00:09:36.555532 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:36 crc kubenswrapper[4994]: I0310 00:09:36.555585 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:36 crc kubenswrapper[4994]: E0310 00:09:36.555669 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:36 crc kubenswrapper[4994]: E0310 00:09:36.555756 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:36 crc kubenswrapper[4994]: E0310 00:09:36.680695 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"
Mar 10 00:09:37 crc kubenswrapper[4994]: I0310 00:09:37.554669 4994 scope.go:117] "RemoveContainer" containerID="04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106"
Mar 10 00:09:37 crc kubenswrapper[4994]: I0310 00:09:37.555250 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e"
Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.553778 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.553815 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2"
Mar 10 00:09:38 crc kubenswrapper[4994]: E0310 00:09:38.554529 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.553987 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.553812 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 00:09:38 crc kubenswrapper[4994]: E0310 00:09:38.554687 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d"
Mar 10 00:09:38 crc kubenswrapper[4994]: E0310 00:09:38.554979 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 00:09:38 crc kubenswrapper[4994]: E0310 00:09:38.555039 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.564448 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mcxcb_6dac87a5-07eb-488d-85fe-cb8848434ae5/kube-multus/1.log"
Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.564613 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mcxcb" event={"ID":"6dac87a5-07eb-488d-85fe-cb8848434ae5","Type":"ContainerStarted","Data":"d5d956a023e0ec491c4113a78f5143267fd1a96d627be08eb78b078d22c69f89"}
Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.569080 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/3.log"
Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.573426 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9"}
Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.575088 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ns797"
Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.587511 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vxjt2"]
Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.587714 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2"
Mar 10 00:09:38 crc kubenswrapper[4994]: E0310 00:09:38.587990 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d"
Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.638953 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podStartSLOduration=147.638933044 podStartE2EDuration="2m27.638933044s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:38.638329131 +0000 UTC m=+192.812035890" watchObservedRunningTime="2026-03-10 00:09:38.638933044 +0000 UTC m=+192.812639803"
Mar 10 00:09:40 crc kubenswrapper[4994]: I0310 00:09:40.553574 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 00:09:40 crc kubenswrapper[4994]: E0310 00:09:40.554205 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 00:09:40 crc kubenswrapper[4994]: I0310 00:09:40.553574 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 00:09:40 crc kubenswrapper[4994]: I0310 00:09:40.553645 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2"
Mar 10 00:09:40 crc kubenswrapper[4994]: I0310 00:09:40.553632 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 00:09:40 crc kubenswrapper[4994]: E0310 00:09:40.554532 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d"
Mar 10 00:09:40 crc kubenswrapper[4994]: E0310 00:09:40.554393 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 00:09:40 crc kubenswrapper[4994]: E0310 00:09:40.554804 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 00:09:42 crc kubenswrapper[4994]: I0310 00:09:42.553989 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 00:09:42 crc kubenswrapper[4994]: I0310 00:09:42.554003 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 00:09:42 crc kubenswrapper[4994]: I0310 00:09:42.554216 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2"
Mar 10 00:09:42 crc kubenswrapper[4994]: I0310 00:09:42.554331 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 00:09:42 crc kubenswrapper[4994]: I0310 00:09:42.558430 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 10 00:09:42 crc kubenswrapper[4994]: I0310 00:09:42.558624 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 10 00:09:42 crc kubenswrapper[4994]: I0310 00:09:42.558779 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 10 00:09:42 crc kubenswrapper[4994]: I0310 00:09:42.558805 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 10 00:09:42 crc kubenswrapper[4994]: I0310 00:09:42.560324 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 10 00:09:42 crc kubenswrapper[4994]: I0310 00:09:42.560800 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 10 00:09:48 crc kubenswrapper[4994]: I0310 00:09:48.924008 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ns797"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.401414 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.454924 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.456848 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.460845 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xm7j6"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.464643 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.469056 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.469093 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.469508 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.469199 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.469680 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.469200 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.473040 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.473387 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-725jp"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.474082 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.475810 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fxpkq"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.476343 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.477696 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.477911 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pwpc6"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.482991 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lxxqb"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.483520 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-m6jnx"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.483689 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pwpc6"
Mar 10 00:09:49 crc kubenswrapper[4994]: W0310 00:09:49.484258 4994 reflector.go:561] object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj": failed to list *v1.Secret: secrets "authentication-operator-dockercfg-mz9bj" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object
Mar 10 00:09:49 crc kubenswrapper[4994]: E0310 00:09:49.484311 4994 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"authentication-operator-dockercfg-mz9bj\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"authentication-operator-dockercfg-mz9bj\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.484495 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.484953 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.485076 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.485267 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mzg85"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.485385 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.485626 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.485717 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.485863 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.486064 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.486150 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.485687 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.486324 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.486785 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.486924 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.487554 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.488038 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.488095 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.488270 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.488386 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.489258 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.489494 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.488495 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.488779 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.488629 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.491795 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.492000 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.488775 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.492347 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.500809 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.505567 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.506026 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.506205 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.506341 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.506570 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.506664 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.507132 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.508510 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.509261 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.516832 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.517358 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.517838 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.518216 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.518409 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.518618 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.518983 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.520204 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.520496 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.520650 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.520832 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.521255 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.535118 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fxpkq"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.535173 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.536203 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.546642 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-rlqtz"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.547191 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.548431 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.548661 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.548839 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.567406 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.567663 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.568434 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.568627 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.570918 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.572270 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.572833 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-8lrmb"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.573185 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hqlnc"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.573514 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29551680-sz8pz"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.573978 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29551680-sz8pz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.574045 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.574394 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.574677 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.574702 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.574947 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-8lrmb"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.575184 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.576213 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.574948 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.583843 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584201 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-etcd-serving-ca\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584251 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-serving-cert\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584281 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-config\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584309 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06c3fca2-64b6-47e2-885f-948eac331c10-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mpvlf\" (UID: \"06c3fca2-64b6-47e2-885f-948eac331c10\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584337 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwggf\" (UniqueName: \"kubernetes.io/projected/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-kube-api-access-jwggf\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584367 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5937dfbb-0da7-439c-94cb-e0e1f658d464-audit-dir\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584414 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thbrs\" (UniqueName: \"kubernetes.io/projected/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-kube-api-access-thbrs\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584444 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584472 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4c9cbda0-655c-4cf9-8f9a-23b3ebf37339-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mzg85\" (UID: \"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584505 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-encryption-config\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584531 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/903778b5-0c60-42d6-8773-a1345817fe1f-audit-dir\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584560 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584592 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584620 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs6gr\" (UniqueName: \"kubernetes.io/projected/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-kube-api-access-rs6gr\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584644 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-75h8c"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.575001 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584730 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.575057 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.585147 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.575178 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.585274 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-x6s5d"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.575334 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.575409 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.585603 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97"]
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.585671 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.575468 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.575559 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.586145 4994 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.586180 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.586304 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.575585 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.575708 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.576960 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.582084 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.582485 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.582535 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.582583 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.582732 4994 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"config-operator-serving-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.582843 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.582988 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.586381 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.587066 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.587566 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.587958 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.588195 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.588378 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.588616 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.588753 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584650 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-config\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.598698 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l78l\" (UniqueName: \"kubernetes.io/projected/9a1c67e3-f6df-4b4d-b3a3-669503580446-kube-api-access-9l78l\") pod \"cluster-samples-operator-665b6dd947-87hn7\" (UID: \"9a1c67e3-f6df-4b4d-b3a3-669503580446\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.598755 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.598792 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ed61f01-8d13-4883-ac58-0e998df5c20d-config\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.598815 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-serving-cert\") pod 
\"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.598845 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fl9d\" (UniqueName: \"kubernetes.io/projected/3a5ced5c-b690-4a1c-8d48-bbf789366816-kube-api-access-4fl9d\") pod \"openshift-apiserver-operator-796bbdcf4f-lgqdf\" (UID: \"3a5ced5c-b690-4a1c-8d48-bbf789366816\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.598870 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s798j\" (UniqueName: \"kubernetes.io/projected/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-kube-api-access-s798j\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.600834 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.586655 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.600915 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.600945 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-config\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.601036 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5937dfbb-0da7-439c-94cb-e0e1f658d464-encryption-config\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.601062 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a1c67e3-f6df-4b4d-b3a3-669503580446-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-87hn7\" (UID: \"9a1c67e3-f6df-4b4d-b3a3-669503580446\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.601104 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.601127 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-config\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.601148 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-service-ca-bundle\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.601171 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a5ced5c-b690-4a1c-8d48-bbf789366816-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lgqdf\" (UID: \"3a5ced5c-b690-4a1c-8d48-bbf789366816\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.601209 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxm5s\" (UniqueName: 
\"kubernetes.io/projected/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-kube-api-access-rxm5s\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.601257 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-config\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.586683 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.588940 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.593960 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.586932 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.589227 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.592398 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.593480 4994 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.603499 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.594086 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.594243 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.594389 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.594434 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.603794 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.594557 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.594599 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.594638 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.597613 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.602056 4994 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608384 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmfx4\" (UniqueName: \"kubernetes.io/projected/1109e060-ef32-407d-8283-eba65e1d4eaa-kube-api-access-wmfx4\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608419 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608465 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-audit-dir\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608512 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d8f6\" (UniqueName: \"kubernetes.io/projected/6ed61f01-8d13-4883-ac58-0e998df5c20d-kube-api-access-5d8f6\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608539 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-serving-cert\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " 
pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608569 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-client-ca\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608613 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608689 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-audit-policies\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608717 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-serving-cert\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608744 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5937dfbb-0da7-439c-94cb-e0e1f658d464-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608773 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c9cbda0-655c-4cf9-8f9a-23b3ebf37339-serving-cert\") pod \"openshift-config-operator-7777fb866f-mzg85\" (UID: \"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608808 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608835 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608838 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5937dfbb-0da7-439c-94cb-e0e1f658d464-audit-policies\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608949 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609127 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5937dfbb-0da7-439c-94cb-e0e1f658d464-etcd-client\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609194 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ed61f01-8d13-4883-ac58-0e998df5c20d-trusted-ca\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609225 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpvwk\" (UniqueName: \"kubernetes.io/projected/06c3fca2-64b6-47e2-885f-948eac331c10-kube-api-access-lpvwk\") pod \"openshift-controller-manager-operator-756b6f6bc6-mpvlf\" (UID: \"06c3fca2-64b6-47e2-885f-948eac331c10\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609251 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609265 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-client-ca\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609296 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1109e060-ef32-407d-8283-eba65e1d4eaa-machine-approver-tls\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.602454 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609350 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06c3fca2-64b6-47e2-885f-948eac331c10-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mpvlf\" (UID: \"06c3fca2-64b6-47e2-885f-948eac331c10\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609383 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5937dfbb-0da7-439c-94cb-e0e1f658d464-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: 
\"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609514 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609527 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609662 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48dxj\" (UniqueName: \"kubernetes.io/projected/903778b5-0c60-42d6-8773-a1345817fe1f-kube-api-access-48dxj\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609693 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1109e060-ef32-407d-8283-eba65e1d4eaa-config\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609719 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-node-pullsecrets\") pod \"apiserver-76f77b778f-lxxqb\" (UID: 
\"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609747 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-etcd-client\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609754 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609799 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-image-import-ca\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609837 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609985 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 
00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.610025 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npjwm\" (UniqueName: \"kubernetes.io/projected/4c9cbda0-655c-4cf9-8f9a-23b3ebf37339-kube-api-access-npjwm\") pod \"openshift-config-operator-7777fb866f-mzg85\" (UID: \"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.610075 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2qtg\" (UniqueName: \"kubernetes.io/projected/5937dfbb-0da7-439c-94cb-e0e1f658d464-kube-api-access-w2qtg\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.610160 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.610304 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.610383 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-audit\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.610457 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5937dfbb-0da7-439c-94cb-e0e1f658d464-serving-cert\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.610540 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.614221 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.610628 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a5ced5c-b690-4a1c-8d48-bbf789366816-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lgqdf\" (UID: \"3a5ced5c-b690-4a1c-8d48-bbf789366816\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.615576 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1109e060-ef32-407d-8283-eba65e1d4eaa-auth-proxy-config\") pod 
\"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.615643 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ed61f01-8d13-4883-ac58-0e998df5c20d-serving-cert\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.615680 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-images\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.616352 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.620121 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.620836 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.621136 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.621344 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.624332 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6sx4w"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.624776 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.628191 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.629029 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-pvfj5"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.634041 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.634500 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vh5ns"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.635336 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.635964 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tgf68"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.636738 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.637584 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.638508 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.639276 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.639862 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.641120 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-966nr"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.642060 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.642740 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551688-9zsf6"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.643649 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.643862 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.644410 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551688-9zsf6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.645959 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.646294 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.646319 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.646693 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.647401 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-m6jnx"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.647424 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pwpc6"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.647436 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wv4d4"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.647910 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.649705 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.649746 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-725jp"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.649840 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.651702 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xm7j6"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.653083 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.654135 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.655228 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.656345 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.657482 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.660372 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.661039 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29551680-sz8pz"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.662123 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8lrmb"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.670501 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.672133 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-pvfj5"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.672661 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.679375 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.679809 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.681018 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rlqtz"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.682097 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.683750 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.684745 4994 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lxxqb"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.685791 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mzg85"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.687115 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hqlnc"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.689700 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.690598 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.692521 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wv4d4"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.694030 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-75h8c"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.695043 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.696027 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5hvbc"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.696943 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5hvbc" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.697422 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.698501 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.699471 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.699931 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551688-9zsf6"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.702268 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-966nr"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.707785 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vh5ns"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.711507 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.716092 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tgf68"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717051 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") 
" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717108 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-encryption-config\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717139 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dbc1f03-d386-460b-81f5-e6b7d3630557-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2m4gh\" (UID: \"4dbc1f03-d386-460b-81f5-e6b7d3630557\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717173 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/903778b5-0c60-42d6-8773-a1345817fe1f-audit-dir\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717194 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717217 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" 
(UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717235 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs6gr\" (UniqueName: \"kubernetes.io/projected/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-kube-api-access-rs6gr\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717304 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-config\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717324 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l78l\" (UniqueName: \"kubernetes.io/projected/9a1c67e3-f6df-4b4d-b3a3-669503580446-kube-api-access-9l78l\") pod \"cluster-samples-operator-665b6dd947-87hn7\" (UID: \"9a1c67e3-f6df-4b4d-b3a3-669503580446\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717344 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qdfc\" (UniqueName: \"kubernetes.io/projected/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-kube-api-access-6qdfc\") pod \"machine-config-controller-84d6567774-966nr\" (UID: \"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717365 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/018c45cc-8cfa-497b-b6cf-25b10c694c58-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717382 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2204937d-9632-46e6-8f26-0cea8593d1a5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cvds8\" (UID: \"2204937d-9632-46e6-8f26-0cea8593d1a5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717401 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717418 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ed61f01-8d13-4883-ac58-0e998df5c20d-config\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717435 4994 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-serving-cert\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717452 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fl9d\" (UniqueName: \"kubernetes.io/projected/3a5ced5c-b690-4a1c-8d48-bbf789366816-kube-api-access-4fl9d\") pod \"openshift-apiserver-operator-796bbdcf4f-lgqdf\" (UID: \"3a5ced5c-b690-4a1c-8d48-bbf789366816\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717474 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-config\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717494 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s798j\" (UniqueName: \"kubernetes.io/projected/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-kube-api-access-s798j\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717511 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717537 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717555 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5937dfbb-0da7-439c-94cb-e0e1f658d464-encryption-config\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717574 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-service-ca-bundle\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717621 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a1c67e3-f6df-4b4d-b3a3-669503580446-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-87hn7\" (UID: \"9a1c67e3-f6df-4b4d-b3a3-669503580446\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717653 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/018c45cc-8cfa-497b-b6cf-25b10c694c58-metrics-tls\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717689 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717710 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-config\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717736 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4dbc1f03-d386-460b-81f5-e6b7d3630557-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2m4gh\" (UID: \"4dbc1f03-d386-460b-81f5-e6b7d3630557\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717758 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxm5s\" (UniqueName: \"kubernetes.io/projected/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-kube-api-access-rxm5s\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717776 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a5ced5c-b690-4a1c-8d48-bbf789366816-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lgqdf\" (UID: \"3a5ced5c-b690-4a1c-8d48-bbf789366816\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717795 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvvnx\" (UniqueName: \"kubernetes.io/projected/11b78073-cc4a-4a6f-89ab-631fde4b3371-kube-api-access-gvvnx\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717818 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29f559d0-b505-4855-91e3-e46804b0c9f1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4xmwf\" (UID: \"29f559d0-b505-4855-91e3-e46804b0c9f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717838 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-config\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717858 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmfx4\" (UniqueName: 
\"kubernetes.io/projected/1109e060-ef32-407d-8283-eba65e1d4eaa-kube-api-access-wmfx4\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717877 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-serving-cert\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717912 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-audit-dir\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717930 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7jt5\" (UniqueName: \"kubernetes.io/projected/018c45cc-8cfa-497b-b6cf-25b10c694c58-kube-api-access-b7jt5\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717954 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d8f6\" (UniqueName: \"kubernetes.io/projected/6ed61f01-8d13-4883-ac58-0e998df5c20d-kube-api-access-5d8f6\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717973 4994 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717993 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718011 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-client-ca\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718032 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-webhook-cert\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718054 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/018c45cc-8cfa-497b-b6cf-25b10c694c58-trusted-ca\") pod \"ingress-operator-5b745b69d9-xhzh4\" 
(UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718072 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-serving-cert\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718091 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq5dn\" (UniqueName: \"kubernetes.io/projected/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-kube-api-access-hq5dn\") pod \"olm-operator-6b444d44fb-c4cdv\" (UID: \"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718107 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718125 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-audit-policies\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718144 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5937dfbb-0da7-439c-94cb-e0e1f658d464-etcd-client\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718164 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5937dfbb-0da7-439c-94cb-e0e1f658d464-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718181 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c9cbda0-655c-4cf9-8f9a-23b3ebf37339-serving-cert\") pod \"openshift-config-operator-7777fb866f-mzg85\" (UID: \"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718201 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718218 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5937dfbb-0da7-439c-94cb-e0e1f658d464-audit-policies\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718235 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-trusted-ca-bundle\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718240 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-config\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718256 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6snqr\" (UniqueName: \"kubernetes.io/projected/f15954a6-2036-4c32-a8b6-bc8e227d0fcd-kube-api-access-6snqr\") pod \"control-plane-machine-set-operator-78cbb6b69f-vjj5j\" (UID: \"f15954a6-2036-4c32-a8b6-bc8e227d0fcd\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718276 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-966nr\" (UID: \"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718295 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ed61f01-8d13-4883-ac58-0e998df5c20d-trusted-ca\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 
00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718321 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpvwk\" (UniqueName: \"kubernetes.io/projected/06c3fca2-64b6-47e2-885f-948eac331c10-kube-api-access-lpvwk\") pod \"openshift-controller-manager-operator-756b6f6bc6-mpvlf\" (UID: \"06c3fca2-64b6-47e2-885f-948eac331c10\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718347 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7278l\" (UniqueName: \"kubernetes.io/projected/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-kube-api-access-7278l\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718371 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-client-ca\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718390 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f15954a6-2036-4c32-a8b6-bc8e227d0fcd-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vjj5j\" (UID: \"f15954a6-2036-4c32-a8b6-bc8e227d0fcd\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718417 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5937dfbb-0da7-439c-94cb-e0e1f658d464-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718433 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1109e060-ef32-407d-8283-eba65e1d4eaa-machine-approver-tls\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718450 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-apiservice-cert\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718468 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06c3fca2-64b6-47e2-885f-948eac331c10-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mpvlf\" (UID: \"06c3fca2-64b6-47e2-885f-948eac331c10\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718488 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-console-config\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz" 
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718506 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-service-ca\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718528 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718545 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48dxj\" (UniqueName: \"kubernetes.io/projected/903778b5-0c60-42d6-8773-a1345817fe1f-kube-api-access-48dxj\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718563 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1109e060-ef32-407d-8283-eba65e1d4eaa-config\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718587 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-node-pullsecrets\") pod \"apiserver-76f77b778f-lxxqb\" (UID: 
\"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718645 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-etcd-client\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718676 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-image-import-ca\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718710 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718718 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718737 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718769 4994 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-npjwm\" (UniqueName: \"kubernetes.io/projected/4c9cbda0-655c-4cf9-8f9a-23b3ebf37339-kube-api-access-npjwm\") pod \"openshift-config-operator-7777fb866f-mzg85\" (UID: \"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718792 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1377f73a-df08-4450-afa1-960e15891141-config-volume\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718814 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxzm2\" (UniqueName: \"kubernetes.io/projected/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-kube-api-access-qxzm2\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718836 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-srv-cert\") pod \"olm-operator-6b444d44fb-c4cdv\" (UID: \"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718861 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2qtg\" (UniqueName: \"kubernetes.io/projected/5937dfbb-0da7-439c-94cb-e0e1f658d464-kube-api-access-w2qtg\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718943 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5937dfbb-0da7-439c-94cb-e0e1f658d464-serving-cert\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718967 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718990 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719012 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-audit\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719040 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/11b78073-cc4a-4a6f-89ab-631fde4b3371-console-serving-cert\") pod 
\"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719063 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-oauth-serving-cert\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719087 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgr88\" (UniqueName: \"kubernetes.io/projected/1377f73a-df08-4450-afa1-960e15891141-kube-api-access-dgr88\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719115 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719142 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a5ced5c-b690-4a1c-8d48-bbf789366816-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lgqdf\" (UID: \"3a5ced5c-b690-4a1c-8d48-bbf789366816\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719165 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/1109e060-ef32-407d-8283-eba65e1d4eaa-auth-proxy-config\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719190 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhnvv\" (UniqueName: \"kubernetes.io/projected/3ffed56d-e2ab-4fa9-9dac-98c382395f2f-kube-api-access-lhnvv\") pod \"migrator-59844c95c7-fkpf5\" (UID: \"3ffed56d-e2ab-4fa9-9dac-98c382395f2f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719221 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29f559d0-b505-4855-91e3-e46804b0c9f1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4xmwf\" (UID: \"29f559d0-b505-4855-91e3-e46804b0c9f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719248 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ed61f01-8d13-4883-ac58-0e998df5c20d-serving-cert\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719271 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-images\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc 
kubenswrapper[4994]: I0310 00:09:49.719293 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-tmpfs\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719320 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-config\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719347 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-etcd-serving-ca\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719370 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1377f73a-df08-4450-afa1-960e15891141-metrics-tls\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719395 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-serving-cert\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc 
kubenswrapper[4994]: I0310 00:09:49.719419 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06c3fca2-64b6-47e2-885f-948eac331c10-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mpvlf\" (UID: \"06c3fca2-64b6-47e2-885f-948eac331c10\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719443 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pv9b\" (UniqueName: \"kubernetes.io/projected/49f58ba5-3573-4894-a320-fcf4ca4e50f1-kube-api-access-4pv9b\") pod \"catalog-operator-68c6474976-bkq7b\" (UID: \"49f58ba5-3573-4894-a320-fcf4ca4e50f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719465 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c4cdv\" (UID: \"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719498 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thbrs\" (UniqueName: \"kubernetes.io/projected/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-kube-api-access-thbrs\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719531 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwggf\" (UniqueName: 
\"kubernetes.io/projected/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-kube-api-access-jwggf\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719554 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5937dfbb-0da7-439c-94cb-e0e1f658d464-audit-dir\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719590 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dbc1f03-d386-460b-81f5-e6b7d3630557-config\") pod \"kube-controller-manager-operator-78b949d7b-2m4gh\" (UID: \"4dbc1f03-d386-460b-81f5-e6b7d3630557\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719613 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49f58ba5-3573-4894-a320-fcf4ca4e50f1-srv-cert\") pod \"catalog-operator-68c6474976-bkq7b\" (UID: \"49f58ba5-3573-4894-a320-fcf4ca4e50f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719642 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/49f58ba5-3573-4894-a320-fcf4ca4e50f1-profile-collector-cert\") pod \"catalog-operator-68c6474976-bkq7b\" (UID: \"49f58ba5-3573-4894-a320-fcf4ca4e50f1\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719676 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv5f5\" (UniqueName: \"kubernetes.io/projected/2204937d-9632-46e6-8f26-0cea8593d1a5-kube-api-access-mv5f5\") pod \"package-server-manager-789f6589d5-cvds8\" (UID: \"2204937d-9632-46e6-8f26-0cea8593d1a5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719709 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719734 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4c9cbda0-655c-4cf9-8f9a-23b3ebf37339-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mzg85\" (UID: \"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719762 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-proxy-tls\") pod \"machine-config-controller-84d6567774-966nr\" (UID: \"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719767 4994 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6sx4w"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719786 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/11b78073-cc4a-4a6f-89ab-631fde4b3371-console-oauth-config\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719903 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29f559d0-b505-4855-91e3-e46804b0c9f1-config\") pod \"kube-apiserver-operator-766d6c64bb-4xmwf\" (UID: \"29f559d0-b505-4855-91e3-e46804b0c9f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.721163 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717700 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/903778b5-0c60-42d6-8773-a1345817fe1f-audit-dir\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.721655 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5937dfbb-0da7-439c-94cb-e0e1f658d464-trusted-ca-bundle\") pod 
\"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.721849 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.722422 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ed61f01-8d13-4883-ac58-0e998df5c20d-config\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.722498 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1109e060-ef32-407d-8283-eba65e1d4eaa-config\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.722584 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-node-pullsecrets\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.722924 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ed61f01-8d13-4883-ac58-0e998df5c20d-trusted-ca\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.724264 4994 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-client-ca\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.727518 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-etcd-client\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.728601 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5937dfbb-0da7-439c-94cb-e0e1f658d464-audit-dir\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.727839 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06c3fca2-64b6-47e2-885f-948eac331c10-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mpvlf\" (UID: \"06c3fca2-64b6-47e2-885f-948eac331c10\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.727723 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5hvbc"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.729125 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-audit-dir\") pod \"apiserver-76f77b778f-lxxqb\" (UID: 
\"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.730236 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c9cbda0-655c-4cf9-8f9a-23b3ebf37339-serving-cert\") pod \"openshift-config-operator-7777fb866f-mzg85\" (UID: \"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.730337 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a5ced5c-b690-4a1c-8d48-bbf789366816-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lgqdf\" (UID: \"3a5ced5c-b690-4a1c-8d48-bbf789366816\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.730785 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06c3fca2-64b6-47e2-885f-948eac331c10-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mpvlf\" (UID: \"06c3fca2-64b6-47e2-885f-948eac331c10\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.731061 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-config\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.731219 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/5937dfbb-0da7-439c-94cb-e0e1f658d464-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.731969 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1109e060-ef32-407d-8283-eba65e1d4eaa-machine-approver-tls\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.733519 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.733818 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4c9cbda0-655c-4cf9-8f9a-23b3ebf37339-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mzg85\" (UID: \"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.733958 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-etcd-serving-ca\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.734169 4994 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.734273 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.734274 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-client-ca\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.734458 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1109e060-ef32-407d-8283-eba65e1d4eaa-auth-proxy-config\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.735263 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-config\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.735480 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5937dfbb-0da7-439c-94cb-e0e1f658d464-encryption-config\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.735566 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-config\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.735603 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-audit-policies\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.735639 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5937dfbb-0da7-439c-94cb-e0e1f658d464-audit-policies\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.735638 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-service-ca-bundle\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.735647 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.736065 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a5ced5c-b690-4a1c-8d48-bbf789366816-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lgqdf\" (UID: \"3a5ced5c-b690-4a1c-8d48-bbf789366816\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.736248 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.736375 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5937dfbb-0da7-439c-94cb-e0e1f658d464-etcd-client\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.736554 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-images\") pod 
\"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.736778 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-image-import-ca\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.737127 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-serving-cert\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.737202 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.737125 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-audit\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.737317 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.737963 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-config\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.738433 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5937dfbb-0da7-439c-94cb-e0e1f658d464-serving-cert\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.739454 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.739912 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.740570 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a1c67e3-f6df-4b4d-b3a3-669503580446-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-87hn7\" (UID: 
\"9a1c67e3-f6df-4b4d-b3a3-669503580446\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.743511 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.743686 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.743890 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-serving-cert\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.744015 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ed61f01-8d13-4883-ac58-0e998df5c20d-serving-cert\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.744077 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-serving-cert\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.744922 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.747063 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cf2xx"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.747691 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.747990 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-serving-cert\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.748161 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-encryption-config\") pod 
\"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.749465 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.751002 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.752917 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.755185 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-47fkz"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.756169 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cf2xx"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.756279 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.761849 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.779420 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.800202 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.820209 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821186 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/11b78073-cc4a-4a6f-89ab-631fde4b3371-console-oauth-config\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821261 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29f559d0-b505-4855-91e3-e46804b0c9f1-config\") pod \"kube-apiserver-operator-766d6c64bb-4xmwf\" (UID: \"29f559d0-b505-4855-91e3-e46804b0c9f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821324 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dbc1f03-d386-460b-81f5-e6b7d3630557-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-2m4gh\" (UID: \"4dbc1f03-d386-460b-81f5-e6b7d3630557\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821413 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qdfc\" (UniqueName: \"kubernetes.io/projected/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-kube-api-access-6qdfc\") pod \"machine-config-controller-84d6567774-966nr\" (UID: \"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821480 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821511 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/018c45cc-8cfa-497b-b6cf-25b10c694c58-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821582 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2204937d-9632-46e6-8f26-0cea8593d1a5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cvds8\" (UID: \"2204937d-9632-46e6-8f26-0cea8593d1a5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" Mar 10 
00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821660 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/018c45cc-8cfa-497b-b6cf-25b10c694c58-metrics-tls\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821698 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4dbc1f03-d386-460b-81f5-e6b7d3630557-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2m4gh\" (UID: \"4dbc1f03-d386-460b-81f5-e6b7d3630557\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821744 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvvnx\" (UniqueName: \"kubernetes.io/projected/11b78073-cc4a-4a6f-89ab-631fde4b3371-kube-api-access-gvvnx\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821773 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29f559d0-b505-4855-91e3-e46804b0c9f1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4xmwf\" (UID: \"29f559d0-b505-4855-91e3-e46804b0c9f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821828 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7jt5\" (UniqueName: \"kubernetes.io/projected/018c45cc-8cfa-497b-b6cf-25b10c694c58-kube-api-access-b7jt5\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821945 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822004 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-webhook-cert\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822026 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/018c45cc-8cfa-497b-b6cf-25b10c694c58-trusted-ca\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822087 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq5dn\" (UniqueName: \"kubernetes.io/projected/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-kube-api-access-hq5dn\") pod \"olm-operator-6b444d44fb-c4cdv\" (UID: \"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822134 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-trusted-ca-bundle\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822163 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6snqr\" (UniqueName: \"kubernetes.io/projected/f15954a6-2036-4c32-a8b6-bc8e227d0fcd-kube-api-access-6snqr\") pod \"control-plane-machine-set-operator-78cbb6b69f-vjj5j\" (UID: \"f15954a6-2036-4c32-a8b6-bc8e227d0fcd\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822209 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-966nr\" (UID: \"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822252 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7278l\" (UniqueName: \"kubernetes.io/projected/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-kube-api-access-7278l\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822350 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f15954a6-2036-4c32-a8b6-bc8e227d0fcd-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vjj5j\" (UID: \"f15954a6-2036-4c32-a8b6-bc8e227d0fcd\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822412 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-apiservice-cert\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822442 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-console-config\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822492 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-service-ca\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822549 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1377f73a-df08-4450-afa1-960e15891141-config-volume\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822582 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxzm2\" (UniqueName: \"kubernetes.io/projected/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-kube-api-access-qxzm2\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822609 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-srv-cert\") pod \"olm-operator-6b444d44fb-c4cdv\" (UID: \"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822635 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhnvv\" (UniqueName: \"kubernetes.io/projected/3ffed56d-e2ab-4fa9-9dac-98c382395f2f-kube-api-access-lhnvv\") pod \"migrator-59844c95c7-fkpf5\" (UID: \"3ffed56d-e2ab-4fa9-9dac-98c382395f2f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822656 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/11b78073-cc4a-4a6f-89ab-631fde4b3371-console-serving-cert\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822680 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-oauth-serving-cert\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822703 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgr88\" (UniqueName: \"kubernetes.io/projected/1377f73a-df08-4450-afa1-960e15891141-kube-api-access-dgr88\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822729 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-tmpfs\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822758 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29f559d0-b505-4855-91e3-e46804b0c9f1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4xmwf\" (UID: \"29f559d0-b505-4855-91e3-e46804b0c9f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822784 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1377f73a-df08-4450-afa1-960e15891141-metrics-tls\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822811 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pv9b\" (UniqueName: \"kubernetes.io/projected/49f58ba5-3573-4894-a320-fcf4ca4e50f1-kube-api-access-4pv9b\") pod \"catalog-operator-68c6474976-bkq7b\" (UID: \"49f58ba5-3573-4894-a320-fcf4ca4e50f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822831 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c4cdv\" (UID: \"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822924 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dbc1f03-d386-460b-81f5-e6b7d3630557-config\") pod \"kube-controller-manager-operator-78b949d7b-2m4gh\" (UID: \"4dbc1f03-d386-460b-81f5-e6b7d3630557\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822950 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49f58ba5-3573-4894-a320-fcf4ca4e50f1-srv-cert\") pod \"catalog-operator-68c6474976-bkq7b\" (UID: \"49f58ba5-3573-4894-a320-fcf4ca4e50f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822975 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/49f58ba5-3573-4894-a320-fcf4ca4e50f1-profile-collector-cert\") pod \"catalog-operator-68c6474976-bkq7b\" (UID: \"49f58ba5-3573-4894-a320-fcf4ca4e50f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822999 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv5f5\" (UniqueName: \"kubernetes.io/projected/2204937d-9632-46e6-8f26-0cea8593d1a5-kube-api-access-mv5f5\") pod \"package-server-manager-789f6589d5-cvds8\" (UID: \"2204937d-9632-46e6-8f26-0cea8593d1a5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.823032 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-proxy-tls\") pod \"machine-config-controller-84d6567774-966nr\" (UID: \"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.823563 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-tmpfs\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.823625 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-trusted-ca-bundle\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.823994 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-966nr\" (UID: \"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.824277 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-console-config\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.824739 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-oauth-serving-cert\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.825419 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-service-ca\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.826568 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/11b78073-cc4a-4a6f-89ab-631fde4b3371-console-oauth-config\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.826594 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-apiservice-cert\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.827651 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-webhook-cert\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.829100 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/11b78073-cc4a-4a6f-89ab-631fde4b3371-console-serving-cert\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.839708 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.849196 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/018c45cc-8cfa-497b-b6cf-25b10c694c58-metrics-tls\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.860060 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.887007 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.893664 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/018c45cc-8cfa-497b-b6cf-25b10c694c58-trusted-ca\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.899322 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.918906 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.940090 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.958966 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.979041 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.999840 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.020392 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.039877 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.059505 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.079639 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.099170 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.118761 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.140666 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.160757 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.179694 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.200054 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.220479 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.240221 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.247263 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dbc1f03-d386-460b-81f5-e6b7d3630557-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2m4gh\" (UID: \"4dbc1f03-d386-460b-81f5-e6b7d3630557\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.260037 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.265750 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dbc1f03-d386-460b-81f5-e6b7d3630557-config\") pod \"kube-controller-manager-operator-78b949d7b-2m4gh\" (UID: \"4dbc1f03-d386-460b-81f5-e6b7d3630557\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.279547 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.288194 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2204937d-9632-46e6-8f26-0cea8593d1a5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cvds8\" (UID: \"2204937d-9632-46e6-8f26-0cea8593d1a5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.299453 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.320654 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.340726 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.360161 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.380785 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.400035 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.408614 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c4cdv\" (UID: \"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.410285 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/49f58ba5-3573-4894-a320-fcf4ca4e50f1-profile-collector-cert\") pod \"catalog-operator-68c6474976-bkq7b\" (UID: \"49f58ba5-3573-4894-a320-fcf4ca4e50f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.439856 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.461707 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.480604 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.500086 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.521186 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.529150 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29f559d0-b505-4855-91e3-e46804b0c9f1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4xmwf\" (UID: \"29f559d0-b505-4855-91e3-e46804b0c9f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.540254 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.542643 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29f559d0-b505-4855-91e3-e46804b0c9f1-config\") pod \"kube-apiserver-operator-766d6c64bb-4xmwf\" (UID: \"29f559d0-b505-4855-91e3-e46804b0c9f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.559061 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.569193 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49f58ba5-3573-4894-a320-fcf4ca4e50f1-srv-cert\") pod \"catalog-operator-68c6474976-bkq7b\" (UID: \"49f58ba5-3573-4894-a320-fcf4ca4e50f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.580069 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.600609 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.620032 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.637522 4994 request.go:700] Waited for 1.012205583s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/secrets?fieldSelector=metadata.name%3Dserving-cert&limit=500&resourceVersion=0
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.640919 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.660594 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.681228 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.699505 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.720011 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.742649 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.759077 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.779583 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.799822 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.818939 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.822486 4994 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.822558 4994 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.822610 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-operator-metrics podName:b85bbdaa-daa8-4c69-abf9-9f1200eb07cd nodeName:}" failed. No retries permitted until 2026-03-10 00:09:51.322579533 +0000 UTC m=+205.496286322 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-operator-metrics") pod "marketplace-operator-79b997595-tgf68" (UID: "b85bbdaa-daa8-4c69-abf9-9f1200eb07cd") : failed to sync secret cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.822638 4994 secret.go:188] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.822640 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-trusted-ca podName:b85bbdaa-daa8-4c69-abf9-9f1200eb07cd nodeName:}" failed. No retries permitted until 2026-03-10 00:09:51.322627634 +0000 UTC m=+205.496334413 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-trusted-ca") pod "marketplace-operator-79b997595-tgf68" (UID: "b85bbdaa-daa8-4c69-abf9-9f1200eb07cd") : failed to sync configmap cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.822707 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f15954a6-2036-4c32-a8b6-bc8e227d0fcd-control-plane-machine-set-operator-tls podName:f15954a6-2036-4c32-a8b6-bc8e227d0fcd nodeName:}" failed. No retries permitted until 2026-03-10 00:09:51.322685405 +0000 UTC m=+205.496392184 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/f15954a6-2036-4c32-a8b6-bc8e227d0fcd-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-78cbb6b69f-vjj5j" (UID: "f15954a6-2036-4c32-a8b6-bc8e227d0fcd") : failed to sync secret cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.823674 4994 secret.go:188] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to sync secret cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.823741 4994 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.823776 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-proxy-tls podName:2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a nodeName:}" failed. No retries permitted until 2026-03-10 00:09:51.323751902 +0000 UTC m=+205.497458691 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-proxy-tls") pod "machine-config-controller-84d6567774-966nr" (UID: "2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a") : failed to sync secret cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.823821 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-srv-cert podName:5cc67063-d02f-4cb9-a15d-0d0a5c457e6e nodeName:}" failed. No retries permitted until 2026-03-10 00:09:51.323797733 +0000 UTC m=+205.497504522 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-srv-cert") pod "olm-operator-6b444d44fb-c4cdv" (UID: "5cc67063-d02f-4cb9-a15d-0d0a5c457e6e") : failed to sync secret cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.823843 4994 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.823920 4994 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.823967 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1377f73a-df08-4450-afa1-960e15891141-metrics-tls podName:1377f73a-df08-4450-afa1-960e15891141 nodeName:}" failed. No retries permitted until 2026-03-10 00:09:51.323942286 +0000 UTC m=+205.497649075 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1377f73a-df08-4450-afa1-960e15891141-metrics-tls") pod "dns-default-wv4d4" (UID: "1377f73a-df08-4450-afa1-960e15891141") : failed to sync secret cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.824086 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1377f73a-df08-4450-afa1-960e15891141-config-volume podName:1377f73a-df08-4450-afa1-960e15891141 nodeName:}" failed. No retries permitted until 2026-03-10 00:09:51.324046939 +0000 UTC m=+205.497753908 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/1377f73a-df08-4450-afa1-960e15891141-config-volume") pod "dns-default-wv4d4" (UID: "1377f73a-df08-4450-afa1-960e15891141") : failed to sync configmap cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.840601 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.860567 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.880077 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.900972 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.921726 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.951946 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.960174 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.980430 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.000427 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 10 00:09:51 crc kubenswrapper[4994]:
I0310 00:09:51.019844 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.039806 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.059921 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.079231 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.100722 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.120938 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.141740 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.159339 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.179432 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.201201 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.220634 4994 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.240822 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.261705 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.279737 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.300955 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.321004 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.339086 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.358700 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1377f73a-df08-4450-afa1-960e15891141-config-volume\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.358788 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-srv-cert\") pod \"olm-operator-6b444d44fb-c4cdv\" (UID: 
\"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.358878 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1377f73a-df08-4450-afa1-960e15891141-metrics-tls\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.359007 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-proxy-tls\") pod \"machine-config-controller-84d6567774-966nr\" (UID: \"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.359078 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.359248 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.359363 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/f15954a6-2036-4c32-a8b6-bc8e227d0fcd-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vjj5j\" (UID: \"f15954a6-2036-4c32-a8b6-bc8e227d0fcd\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.360318 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.362144 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.364738 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1377f73a-df08-4450-afa1-960e15891141-metrics-tls\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.365414 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-srv-cert\") pod \"olm-operator-6b444d44fb-c4cdv\" (UID: \"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.367511 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-proxy-tls\") pod \"machine-config-controller-84d6567774-966nr\" (UID: \"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.368938 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f15954a6-2036-4c32-a8b6-bc8e227d0fcd-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vjj5j\" (UID: \"f15954a6-2036-4c32-a8b6-bc8e227d0fcd\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.370366 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.383780 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.391300 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1377f73a-df08-4450-afa1-960e15891141-config-volume\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.420422 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.439829 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.460690 4994 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.479836 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.526906 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs6gr\" (UniqueName: \"kubernetes.io/projected/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-kube-api-access-rs6gr\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.553552 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l78l\" (UniqueName: \"kubernetes.io/projected/9a1c67e3-f6df-4b4d-b3a3-669503580446-kube-api-access-9l78l\") pod \"cluster-samples-operator-665b6dd947-87hn7\" (UID: \"9a1c67e3-f6df-4b4d-b3a3-669503580446\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.571557 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48dxj\" (UniqueName: \"kubernetes.io/projected/903778b5-0c60-42d6-8773-a1345817fe1f-kube-api-access-48dxj\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.588394 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpvwk\" (UniqueName: \"kubernetes.io/projected/06c3fca2-64b6-47e2-885f-948eac331c10-kube-api-access-lpvwk\") pod \"openshift-controller-manager-operator-756b6f6bc6-mpvlf\" (UID: \"06c3fca2-64b6-47e2-885f-948eac331c10\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.607902 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxm5s\" (UniqueName: \"kubernetes.io/projected/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-kube-api-access-rxm5s\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.627537 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fl9d\" (UniqueName: \"kubernetes.io/projected/3a5ced5c-b690-4a1c-8d48-bbf789366816-kube-api-access-4fl9d\") pod \"openshift-apiserver-operator-796bbdcf4f-lgqdf\" (UID: \"3a5ced5c-b690-4a1c-8d48-bbf789366816\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.638272 4994 request.go:700] Waited for 1.909735757s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver/serviceaccounts/openshift-apiserver-sa/token Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.646325 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmfx4\" (UniqueName: \"kubernetes.io/projected/1109e060-ef32-407d-8283-eba65e1d4eaa-kube-api-access-wmfx4\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.650093 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.663605 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s798j\" (UniqueName: \"kubernetes.io/projected/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-kube-api-access-s798j\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.681937 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thbrs\" (UniqueName: \"kubernetes.io/projected/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-kube-api-access-thbrs\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.701289 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwggf\" (UniqueName: \"kubernetes.io/projected/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-kube-api-access-jwggf\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.719359 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d8f6\" (UniqueName: \"kubernetes.io/projected/6ed61f01-8d13-4883-ac58-0e998df5c20d-kube-api-access-5d8f6\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.729059 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.739479 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npjwm\" (UniqueName: \"kubernetes.io/projected/4c9cbda0-655c-4cf9-8f9a-23b3ebf37339-kube-api-access-npjwm\") pod \"openshift-config-operator-7777fb866f-mzg85\" (UID: \"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.746061 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.756150 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.760827 4994 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.767387 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.769855 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2qtg\" (UniqueName: \"kubernetes.io/projected/5937dfbb-0da7-439c-94cb-e0e1f658d464-kube-api-access-w2qtg\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.777650 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.779699 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.785370 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.793967 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.799863 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.800366 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.813819 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.822073 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.840077 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.859518 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.878757 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.895745 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fxpkq"] Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.900522 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4dbc1f03-d386-460b-81f5-e6b7d3630557-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2m4gh\" (UID: \"4dbc1f03-d386-460b-81f5-e6b7d3630557\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.909579 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.925817 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qdfc\" (UniqueName: \"kubernetes.io/projected/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-kube-api-access-6qdfc\") pod \"machine-config-controller-84d6567774-966nr\" (UID: \"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.958700 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvvnx\" (UniqueName: \"kubernetes.io/projected/11b78073-cc4a-4a6f-89ab-631fde4b3371-kube-api-access-gvvnx\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.967637 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.970718 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7jt5\" (UniqueName: \"kubernetes.io/projected/018c45cc-8cfa-497b-b6cf-25b10c694c58-kube-api-access-b7jt5\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.977780 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6snqr\" (UniqueName: \"kubernetes.io/projected/f15954a6-2036-4c32-a8b6-bc8e227d0fcd-kube-api-access-6snqr\") pod \"control-plane-machine-set-operator-78cbb6b69f-vjj5j\" (UID: \"f15954a6-2036-4c32-a8b6-bc8e227d0fcd\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.990981 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pwpc6"] Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.993904 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq5dn\" (UniqueName: \"kubernetes.io/projected/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-kube-api-access-hq5dn\") pod \"olm-operator-6b444d44fb-c4cdv\" (UID: \"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" Mar 10 00:09:52 crc kubenswrapper[4994]: W0310 00:09:52.013086 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ed61f01_8d13_4883_ac58_0e998df5c20d.slice/crio-41d7b08bd6160997086ea083f385225cccd4bb088f59fd6fb1f00cf222b305b5 WatchSource:0}: Error finding container 
41d7b08bd6160997086ea083f385225cccd4bb088f59fd6fb1f00cf222b305b5: Status 404 returned error can't find the container with id 41d7b08bd6160997086ea083f385225cccd4bb088f59fd6fb1f00cf222b305b5 Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.016225 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7278l\" (UniqueName: \"kubernetes.io/projected/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-kube-api-access-7278l\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.042754 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgr88\" (UniqueName: \"kubernetes.io/projected/1377f73a-df08-4450-afa1-960e15891141-kube-api-access-dgr88\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.059633 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxzm2\" (UniqueName: \"kubernetes.io/projected/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-kube-api-access-qxzm2\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.059970 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.071533 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.075776 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29f559d0-b505-4855-91e3-e46804b0c9f1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4xmwf\" (UID: \"29f559d0-b505-4855-91e3-e46804b0c9f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.095840 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhnvv\" (UniqueName: \"kubernetes.io/projected/3ffed56d-e2ab-4fa9-9dac-98c382395f2f-kube-api-access-lhnvv\") pod \"migrator-59844c95c7-fkpf5\" (UID: \"3ffed56d-e2ab-4fa9-9dac-98c382395f2f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.120308 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.122589 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pv9b\" (UniqueName: \"kubernetes.io/projected/49f58ba5-3573-4894-a320-fcf4ca4e50f1-kube-api-access-4pv9b\") pod \"catalog-operator-68c6474976-bkq7b\" (UID: \"49f58ba5-3573-4894-a320-fcf4ca4e50f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.135959 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv5f5\" (UniqueName: \"kubernetes.io/projected/2204937d-9632-46e6-8f26-0cea8593d1a5-kube-api-access-mv5f5\") pod \"package-server-manager-789f6589d5-cvds8\" (UID: \"2204937d-9632-46e6-8f26-0cea8593d1a5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.152679 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.162324 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.162772 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/018c45cc-8cfa-497b-b6cf-25b10c694c58-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.182721 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.183709 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.194526 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.213663 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.214477 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.223938 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.244839 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lxxqb"] Mar 10 00:09:52 crc kubenswrapper[4994]: W0310 00:09:52.269471 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5937dfbb_0da7_439c_94cb_e0e1f658d464.slice/crio-8831fb3579ee9cdb923ec4d8c1aef2294468f33bab907bd5867001319278e8ad WatchSource:0}: Error finding container 8831fb3579ee9cdb923ec4d8c1aef2294468f33bab907bd5867001319278e8ad: Status 404 returned error can't find the container with id 8831fb3579ee9cdb923ec4d8c1aef2294468f33bab907bd5867001319278e8ad Mar 10 00:09:52 crc kubenswrapper[4994]: W0310 00:09:52.271336 4994 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac5d8ed1_6cb6_4ed6_b0b4_7a2c795dbb21.slice/crio-47cdd0fc5df08b42eefeda9e713ce29bf75c312c7780a55dff7b457658f9ecb2 WatchSource:0}: Error finding container 47cdd0fc5df08b42eefeda9e713ce29bf75c312c7780a55dff7b457658f9ecb2: Status 404 returned error can't find the container with id 47cdd0fc5df08b42eefeda9e713ce29bf75c312c7780a55dff7b457658f9ecb2 Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.274595 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spglq\" (UniqueName: \"kubernetes.io/projected/10eead56-2e9b-4d48-ab81-d1638b3cdddc-kube-api-access-spglq\") pod \"multus-admission-controller-857f4d67dd-vh5ns\" (UID: \"10eead56-2e9b-4d48-ab81-d1638b3cdddc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.274710 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0ad3539f-9691-4344-9c7f-1b015c5e3b3d-signing-key\") pod \"service-ca-9c57cc56f-pvfj5\" (UID: \"0ad3539f-9691-4344-9c7f-1b015c5e3b3d\") " pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.274784 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb21f66e-5c18-49bb-8146-8185434e7c2f-config\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.274851 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4fvw\" (UniqueName: \"kubernetes.io/projected/a1456dd8-5038-4bcc-8f19-51325ac84c02-kube-api-access-m4fvw\") pod 
\"auto-csr-approver-29551688-9zsf6\" (UID: \"a1456dd8-5038-4bcc-8f19-51325ac84c02\") " pod="openshift-infra/auto-csr-approver-29551688-9zsf6" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.274980 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-metrics-certs\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275043 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28b5630a-9f96-453c-ac88-70d75b7d438d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5t2rc\" (UID: \"28b5630a-9f96-453c-ac88-70d75b7d438d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275090 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngwjf\" (UniqueName: \"kubernetes.io/projected/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-kube-api-access-ngwjf\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275117 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/964eb9a7-a580-44d4-b5e5-fe84d085823c-metrics-tls\") pod \"dns-operator-744455d44c-6sx4w\" (UID: \"964eb9a7-a580-44d4-b5e5-fe84d085823c\") " pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275152 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r68q\" (UniqueName: \"kubernetes.io/projected/1e80a388-91b3-42f1-9ee2-70ab4850652d-kube-api-access-9r68q\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275179 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e80a388-91b3-42f1-9ee2-70ab4850652d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275211 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbrkc\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-kube-api-access-kbrkc\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275251 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e80a388-91b3-42f1-9ee2-70ab4850652d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275279 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-trusted-ca\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275321 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-registry-certificates\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275353 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkgzm\" (UniqueName: \"kubernetes.io/projected/a2bd9787-6df4-492a-8cab-18201a143385-kube-api-access-wkgzm\") pod \"service-ca-operator-777779d784-nf5dh\" (UID: \"a2bd9787-6df4-492a-8cab-18201a143385\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275386 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/eb21f66e-5c18-49bb-8146-8185434e7c2f-etcd-service-ca\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275465 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-secret-volume\") pod \"collect-profiles-29551680-lqhzx\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 
00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275491 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1e80a388-91b3-42f1-9ee2-70ab4850652d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275540 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/10eead56-2e9b-4d48-ab81-d1638b3cdddc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vh5ns\" (UID: \"10eead56-2e9b-4d48-ab81-d1638b3cdddc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275564 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eb21f66e-5c18-49bb-8146-8185434e7c2f-etcd-client\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275623 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5448e53f-3b74-47f1-9b28-705f36fd6ea3-images\") pod \"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275647 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx7fb\" (UniqueName: 
\"kubernetes.io/projected/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-kube-api-access-fx7fb\") pod \"collect-profiles-29551680-lqhzx\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275697 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9rk4\" (UniqueName: \"kubernetes.io/projected/4fb67636-fcba-4975-a460-403cd6ee9c25-kube-api-access-d9rk4\") pod \"downloads-7954f5f757-8lrmb\" (UID: \"4fb67636-fcba-4975-a460-403cd6ee9c25\") " pod="openshift-console/downloads-7954f5f757-8lrmb" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275724 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zmr8\" (UniqueName: \"kubernetes.io/projected/0779a70e-ebf5-4e98-87ea-43017b8d1e46-kube-api-access-2zmr8\") pod \"image-pruner-29551680-sz8pz\" (UID: \"0779a70e-ebf5-4e98-87ea-43017b8d1e46\") " pod="openshift-image-registry/image-pruner-29551680-sz8pz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275745 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2bd9787-6df4-492a-8cab-18201a143385-serving-cert\") pod \"service-ca-operator-777779d784-nf5dh\" (UID: \"a2bd9787-6df4-492a-8cab-18201a143385\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275792 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28b5630a-9f96-453c-ac88-70d75b7d438d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5t2rc\" (UID: \"28b5630a-9f96-453c-ac88-70d75b7d438d\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275814 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/295cba62-fd24-4245-8773-866ee134a29e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275836 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-default-certificate\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275857 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0779a70e-ebf5-4e98-87ea-43017b8d1e46-serviceca\") pod \"image-pruner-29551680-sz8pz\" (UID: \"0779a70e-ebf5-4e98-87ea-43017b8d1e46\") " pod="openshift-image-registry/image-pruner-29551680-sz8pz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275909 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/295cba62-fd24-4245-8773-866ee134a29e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275932 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-service-ca-bundle\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275976 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jkd8t\" (UID: \"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276000 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8fxv\" (UniqueName: \"kubernetes.io/projected/e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4-kube-api-access-q8fxv\") pod \"kube-storage-version-migrator-operator-b67b599dd-jkd8t\" (UID: \"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276023 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsk4r\" (UniqueName: \"kubernetes.io/projected/964eb9a7-a580-44d4-b5e5-fe84d085823c-kube-api-access-tsk4r\") pod \"dns-operator-744455d44c-6sx4w\" (UID: \"964eb9a7-a580-44d4-b5e5-fe84d085823c\") " pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276048 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5448e53f-3b74-47f1-9b28-705f36fd6ea3-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276088 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28b5630a-9f96-453c-ac88-70d75b7d438d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5t2rc\" (UID: \"28b5630a-9f96-453c-ac88-70d75b7d438d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276111 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb21f66e-5c18-49bb-8146-8185434e7c2f-serving-cert\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276173 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276224 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-registry-tls\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 
00:09:52.276247 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-config-volume\") pod \"collect-profiles-29551680-lqhzx\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276268 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-bound-sa-token\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276288 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z95f9\" (UniqueName: \"kubernetes.io/projected/0ad3539f-9691-4344-9c7f-1b015c5e3b3d-kube-api-access-z95f9\") pod \"service-ca-9c57cc56f-pvfj5\" (UID: \"0ad3539f-9691-4344-9c7f-1b015c5e3b3d\") " pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276313 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jkd8t\" (UID: \"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276336 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/0ad3539f-9691-4344-9c7f-1b015c5e3b3d-signing-cabundle\") pod \"service-ca-9c57cc56f-pvfj5\" (UID: \"0ad3539f-9691-4344-9c7f-1b015c5e3b3d\") " pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276359 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-529vv\" (UniqueName: \"kubernetes.io/projected/5448e53f-3b74-47f1-9b28-705f36fd6ea3-kube-api-access-529vv\") pod \"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276393 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5448e53f-3b74-47f1-9b28-705f36fd6ea3-proxy-tls\") pod \"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276416 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/eb21f66e-5c18-49bb-8146-8185434e7c2f-etcd-ca\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276454 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-stats-auth\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: 
I0310 00:09:52.276475 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2bd9787-6df4-492a-8cab-18201a143385-config\") pod \"service-ca-operator-777779d784-nf5dh\" (UID: \"a2bd9787-6df4-492a-8cab-18201a143385\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276500 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glwk4\" (UniqueName: \"kubernetes.io/projected/eb21f66e-5c18-49bb-8146-8185434e7c2f-kube-api-access-glwk4\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.279511 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" Mar 10 00:09:52 crc kubenswrapper[4994]: E0310 00:09:52.281297 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:52.781278505 +0000 UTC m=+206.954985254 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.306742 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.311937 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-m6jnx"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.314653 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.324674 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mzg85"] Mar 10 00:09:52 crc kubenswrapper[4994]: W0310 00:09:52.350932 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe6f59a5_bf2e_4926_b6f2_a18b4cb5479d.slice/crio-f8a9fca1baf9c26e7d4e01430c74bc08dfce2dc2a5bf912289541ecfef313f55 WatchSource:0}: Error finding container f8a9fca1baf9c26e7d4e01430c74bc08dfce2dc2a5bf912289541ecfef313f55: Status 404 returned error can't find the container with id f8a9fca1baf9c26e7d4e01430c74bc08dfce2dc2a5bf912289541ecfef313f55 Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.383196 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.383848 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384067 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/10eead56-2e9b-4d48-ab81-d1638b3cdddc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vh5ns\" (UID: \"10eead56-2e9b-4d48-ab81-d1638b3cdddc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384096 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eb21f66e-5c18-49bb-8146-8185434e7c2f-etcd-client\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384136 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5448e53f-3b74-47f1-9b28-705f36fd6ea3-images\") pod \"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: E0310 00:09:52.384184 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-03-10 00:09:52.884158921 +0000 UTC m=+207.057865670 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384186 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx7fb\" (UniqueName: \"kubernetes.io/projected/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-kube-api-access-fx7fb\") pod \"collect-profiles-29551680-lqhzx\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384251 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9rk4\" (UniqueName: \"kubernetes.io/projected/4fb67636-fcba-4975-a460-403cd6ee9c25-kube-api-access-d9rk4\") pod \"downloads-7954f5f757-8lrmb\" (UID: \"4fb67636-fcba-4975-a460-403cd6ee9c25\") " pod="openshift-console/downloads-7954f5f757-8lrmb" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384278 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zmr8\" (UniqueName: \"kubernetes.io/projected/0779a70e-ebf5-4e98-87ea-43017b8d1e46-kube-api-access-2zmr8\") pod \"image-pruner-29551680-sz8pz\" (UID: \"0779a70e-ebf5-4e98-87ea-43017b8d1e46\") " pod="openshift-image-registry/image-pruner-29551680-sz8pz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384312 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2bd9787-6df4-492a-8cab-18201a143385-serving-cert\") pod \"service-ca-operator-777779d784-nf5dh\" (UID: \"a2bd9787-6df4-492a-8cab-18201a143385\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384340 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-plugins-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384383 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28b5630a-9f96-453c-ac88-70d75b7d438d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5t2rc\" (UID: \"28b5630a-9f96-453c-ac88-70d75b7d438d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384420 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/295cba62-fd24-4245-8773-866ee134a29e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384435 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-default-certificate\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc 
kubenswrapper[4994]: I0310 00:09:52.384453 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0779a70e-ebf5-4e98-87ea-43017b8d1e46-serviceca\") pod \"image-pruner-29551680-sz8pz\" (UID: \"0779a70e-ebf5-4e98-87ea-43017b8d1e46\") " pod="openshift-image-registry/image-pruner-29551680-sz8pz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384467 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/40e41a4c-dba3-4862-9f06-c59c538785be-node-bootstrap-token\") pod \"machine-config-server-47fkz\" (UID: \"40e41a4c-dba3-4862-9f06-c59c538785be\") " pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384502 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/295cba62-fd24-4245-8773-866ee134a29e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384518 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-service-ca-bundle\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384537 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-socket-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " 
pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384576 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jkd8t\" (UID: \"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384592 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8fxv\" (UniqueName: \"kubernetes.io/projected/e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4-kube-api-access-q8fxv\") pod \"kube-storage-version-migrator-operator-b67b599dd-jkd8t\" (UID: \"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384608 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsk4r\" (UniqueName: \"kubernetes.io/projected/964eb9a7-a580-44d4-b5e5-fe84d085823c-kube-api-access-tsk4r\") pod \"dns-operator-744455d44c-6sx4w\" (UID: \"964eb9a7-a580-44d4-b5e5-fe84d085823c\") " pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384623 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5448e53f-3b74-47f1-9b28-705f36fd6ea3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384664 4994 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28b5630a-9f96-453c-ac88-70d75b7d438d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5t2rc\" (UID: \"28b5630a-9f96-453c-ac88-70d75b7d438d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384679 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb21f66e-5c18-49bb-8146-8185434e7c2f-serving-cert\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384740 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384759 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tkcb\" (UniqueName: \"kubernetes.io/projected/7fd7640d-700a-420e-b15f-7f681090727b-kube-api-access-7tkcb\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384779 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-registry-tls\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384796 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-config-volume\") pod \"collect-profiles-29551680-lqhzx\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384828 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-bound-sa-token\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384844 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z95f9\" (UniqueName: \"kubernetes.io/projected/0ad3539f-9691-4344-9c7f-1b015c5e3b3d-kube-api-access-z95f9\") pod \"service-ca-9c57cc56f-pvfj5\" (UID: \"0ad3539f-9691-4344-9c7f-1b015c5e3b3d\") " pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384896 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jkd8t\" (UID: \"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384914 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/0ad3539f-9691-4344-9c7f-1b015c5e3b3d-signing-cabundle\") pod \"service-ca-9c57cc56f-pvfj5\" (UID: \"0ad3539f-9691-4344-9c7f-1b015c5e3b3d\") " pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384938 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-529vv\" (UniqueName: \"kubernetes.io/projected/5448e53f-3b74-47f1-9b28-705f36fd6ea3-kube-api-access-529vv\") pod \"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384945 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/295cba62-fd24-4245-8773-866ee134a29e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384980 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5448e53f-3b74-47f1-9b28-705f36fd6ea3-proxy-tls\") pod \"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384998 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fzzb\" (UniqueName: \"kubernetes.io/projected/40e41a4c-dba3-4862-9f06-c59c538785be-kube-api-access-6fzzb\") pod \"machine-config-server-47fkz\" (UID: \"40e41a4c-dba3-4862-9f06-c59c538785be\") " pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 
00:09:52.385025 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/eb21f66e-5c18-49bb-8146-8185434e7c2f-etcd-ca\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385042 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-mountpoint-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385112 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-stats-auth\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385130 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84x8v\" (UniqueName: \"kubernetes.io/projected/22c138b8-4431-4695-be3f-0ea008d21f30-kube-api-access-84x8v\") pod \"ingress-canary-5hvbc\" (UID: \"22c138b8-4431-4695-be3f-0ea008d21f30\") " pod="openshift-ingress-canary/ingress-canary-5hvbc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385147 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2bd9787-6df4-492a-8cab-18201a143385-config\") pod \"service-ca-operator-777779d784-nf5dh\" (UID: \"a2bd9787-6df4-492a-8cab-18201a143385\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:52 crc 
kubenswrapper[4994]: I0310 00:09:52.385176 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glwk4\" (UniqueName: \"kubernetes.io/projected/eb21f66e-5c18-49bb-8146-8185434e7c2f-kube-api-access-glwk4\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385205 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spglq\" (UniqueName: \"kubernetes.io/projected/10eead56-2e9b-4d48-ab81-d1638b3cdddc-kube-api-access-spglq\") pod \"multus-admission-controller-857f4d67dd-vh5ns\" (UID: \"10eead56-2e9b-4d48-ab81-d1638b3cdddc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385257 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0ad3539f-9691-4344-9c7f-1b015c5e3b3d-signing-key\") pod \"service-ca-9c57cc56f-pvfj5\" (UID: \"0ad3539f-9691-4344-9c7f-1b015c5e3b3d\") " pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385278 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb21f66e-5c18-49bb-8146-8185434e7c2f-config\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385297 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4fvw\" (UniqueName: \"kubernetes.io/projected/a1456dd8-5038-4bcc-8f19-51325ac84c02-kube-api-access-m4fvw\") pod \"auto-csr-approver-29551688-9zsf6\" (UID: \"a1456dd8-5038-4bcc-8f19-51325ac84c02\") " 
pod="openshift-infra/auto-csr-approver-29551688-9zsf6" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385367 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-metrics-certs\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385386 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28b5630a-9f96-453c-ac88-70d75b7d438d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5t2rc\" (UID: \"28b5630a-9f96-453c-ac88-70d75b7d438d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385439 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngwjf\" (UniqueName: \"kubernetes.io/projected/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-kube-api-access-ngwjf\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385472 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/964eb9a7-a580-44d4-b5e5-fe84d085823c-metrics-tls\") pod \"dns-operator-744455d44c-6sx4w\" (UID: \"964eb9a7-a580-44d4-b5e5-fe84d085823c\") " pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385490 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r68q\" (UniqueName: \"kubernetes.io/projected/1e80a388-91b3-42f1-9ee2-70ab4850652d-kube-api-access-9r68q\") pod 
\"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385509 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22c138b8-4431-4695-be3f-0ea008d21f30-cert\") pod \"ingress-canary-5hvbc\" (UID: \"22c138b8-4431-4695-be3f-0ea008d21f30\") " pod="openshift-ingress-canary/ingress-canary-5hvbc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385535 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e80a388-91b3-42f1-9ee2-70ab4850652d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385549 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-csi-data-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385574 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbrkc\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-kube-api-access-kbrkc\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385612 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-registration-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385650 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e80a388-91b3-42f1-9ee2-70ab4850652d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385672 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-trusted-ca\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385706 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-registry-certificates\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385763 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkgzm\" (UniqueName: \"kubernetes.io/projected/a2bd9787-6df4-492a-8cab-18201a143385-kube-api-access-wkgzm\") pod \"service-ca-operator-777779d784-nf5dh\" (UID: \"a2bd9787-6df4-492a-8cab-18201a143385\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:52 
crc kubenswrapper[4994]: I0310 00:09:52.385798 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/eb21f66e-5c18-49bb-8146-8185434e7c2f-etcd-service-ca\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385846 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-secret-volume\") pod \"collect-profiles-29551680-lqhzx\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385870 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1e80a388-91b3-42f1-9ee2-70ab4850652d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385933 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/40e41a4c-dba3-4862-9f06-c59c538785be-certs\") pod \"machine-config-server-47fkz\" (UID: \"40e41a4c-dba3-4862-9f06-c59c538785be\") " pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.390448 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2bd9787-6df4-492a-8cab-18201a143385-config\") pod \"service-ca-operator-777779d784-nf5dh\" (UID: \"a2bd9787-6df4-492a-8cab-18201a143385\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.394351 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28b5630a-9f96-453c-ac88-70d75b7d438d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5t2rc\" (UID: \"28b5630a-9f96-453c-ac88-70d75b7d438d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.395233 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5448e53f-3b74-47f1-9b28-705f36fd6ea3-images\") pod \"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.395292 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/10eead56-2e9b-4d48-ab81-d1638b3cdddc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vh5ns\" (UID: \"10eead56-2e9b-4d48-ab81-d1638b3cdddc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.395421 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/964eb9a7-a580-44d4-b5e5-fe84d085823c-metrics-tls\") pod \"dns-operator-744455d44c-6sx4w\" (UID: \"964eb9a7-a580-44d4-b5e5-fe84d085823c\") " pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.396166 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5448e53f-3b74-47f1-9b28-705f36fd6ea3-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: E0310 00:09:52.411346 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:52.911325092 +0000 UTC m=+207.085031841 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.400006 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2bd9787-6df4-492a-8cab-18201a143385-serving-cert\") pod \"service-ca-operator-777779d784-nf5dh\" (UID: \"a2bd9787-6df4-492a-8cab-18201a143385\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.412230 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-config-volume\") pod \"collect-profiles-29551680-lqhzx\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.412481 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-default-certificate\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.401026 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jkd8t\" (UID: \"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.412991 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28b5630a-9f96-453c-ac88-70d75b7d438d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5t2rc\" (UID: \"28b5630a-9f96-453c-ac88-70d75b7d438d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.413952 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/295cba62-fd24-4245-8773-866ee134a29e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.415519 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e80a388-91b3-42f1-9ee2-70ab4850652d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc 
kubenswrapper[4994]: I0310 00:09:52.415965 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0779a70e-ebf5-4e98-87ea-43017b8d1e46-serviceca\") pod \"image-pruner-29551680-sz8pz\" (UID: \"0779a70e-ebf5-4e98-87ea-43017b8d1e46\") " pod="openshift-image-registry/image-pruner-29551680-sz8pz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.416247 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-metrics-certs\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.417494 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-registry-tls\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.419549 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-service-ca-bundle\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.425827 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-secret-volume\") pod \"collect-profiles-29551680-lqhzx\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:09:52 crc 
kubenswrapper[4994]: I0310 00:09:52.429710 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jkd8t\" (UID: \"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.429743 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-trusted-ca\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.429812 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e80a388-91b3-42f1-9ee2-70ab4850652d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.433565 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0ad3539f-9691-4344-9c7f-1b015c5e3b3d-signing-cabundle\") pod \"service-ca-9c57cc56f-pvfj5\" (UID: \"0ad3539f-9691-4344-9c7f-1b015c5e3b3d\") " pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.434740 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5448e53f-3b74-47f1-9b28-705f36fd6ea3-proxy-tls\") pod \"machine-config-operator-74547568cd-lzcrv\" (UID: 
\"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.438294 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0ad3539f-9691-4344-9c7f-1b015c5e3b3d-signing-key\") pod \"service-ca-9c57cc56f-pvfj5\" (UID: \"0ad3539f-9691-4344-9c7f-1b015c5e3b3d\") " pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.438812 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb21f66e-5c18-49bb-8146-8185434e7c2f-config\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.438851 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/eb21f66e-5c18-49bb-8146-8185434e7c2f-etcd-service-ca\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.438919 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eb21f66e-5c18-49bb-8146-8185434e7c2f-etcd-client\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.440739 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/eb21f66e-5c18-49bb-8146-8185434e7c2f-etcd-ca\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.441345 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-registry-certificates\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.441913 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-stats-auth\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.442601 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9rk4\" (UniqueName: \"kubernetes.io/projected/4fb67636-fcba-4975-a460-403cd6ee9c25-kube-api-access-d9rk4\") pod \"downloads-7954f5f757-8lrmb\" (UID: \"4fb67636-fcba-4975-a460-403cd6ee9c25\") " pod="openshift-console/downloads-7954f5f757-8lrmb" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.453005 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx7fb\" (UniqueName: \"kubernetes.io/projected/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-kube-api-access-fx7fb\") pod \"collect-profiles-29551680-lqhzx\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.470016 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.470720 4994 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.474701 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb21f66e-5c18-49bb-8146-8185434e7c2f-serving-cert\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.480027 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.486137 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zmr8\" (UniqueName: \"kubernetes.io/projected/0779a70e-ebf5-4e98-87ea-43017b8d1e46-kube-api-access-2zmr8\") pod \"image-pruner-29551680-sz8pz\" (UID: \"0779a70e-ebf5-4e98-87ea-43017b8d1e46\") " pod="openshift-image-registry/image-pruner-29551680-sz8pz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.486931 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:52 crc kubenswrapper[4994]: E0310 00:09:52.487188 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:52.987166531 +0000 UTC m=+207.160873280 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487344 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tkcb\" (UniqueName: \"kubernetes.io/projected/7fd7640d-700a-420e-b15f-7f681090727b-kube-api-access-7tkcb\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487373 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487416 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fzzb\" (UniqueName: \"kubernetes.io/projected/40e41a4c-dba3-4862-9f06-c59c538785be-kube-api-access-6fzzb\") pod \"machine-config-server-47fkz\" (UID: \"40e41a4c-dba3-4862-9f06-c59c538785be\") " pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487433 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-mountpoint-dir\") pod 
\"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487460 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84x8v\" (UniqueName: \"kubernetes.io/projected/22c138b8-4431-4695-be3f-0ea008d21f30-kube-api-access-84x8v\") pod \"ingress-canary-5hvbc\" (UID: \"22c138b8-4431-4695-be3f-0ea008d21f30\") " pod="openshift-ingress-canary/ingress-canary-5hvbc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487541 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22c138b8-4431-4695-be3f-0ea008d21f30-cert\") pod \"ingress-canary-5hvbc\" (UID: \"22c138b8-4431-4695-be3f-0ea008d21f30\") " pod="openshift-ingress-canary/ingress-canary-5hvbc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487563 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-csi-data-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487584 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-registration-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487619 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/40e41a4c-dba3-4862-9f06-c59c538785be-certs\") pod \"machine-config-server-47fkz\" (UID: 
\"40e41a4c-dba3-4862-9f06-c59c538785be\") " pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487648 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-plugins-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487712 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-mountpoint-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: E0310 00:09:52.488167 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:52.988152595 +0000 UTC m=+207.161859344 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.488594 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-csi-data-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.488645 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-registration-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.488689 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-plugins-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.491012 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/40e41a4c-dba3-4862-9f06-c59c538785be-node-bootstrap-token\") pod \"machine-config-server-47fkz\" (UID: \"40e41a4c-dba3-4862-9f06-c59c538785be\") " 
pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.491087 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-socket-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.491436 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-socket-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.498010 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4fvw\" (UniqueName: \"kubernetes.io/projected/a1456dd8-5038-4bcc-8f19-51325ac84c02-kube-api-access-m4fvw\") pod \"auto-csr-approver-29551688-9zsf6\" (UID: \"a1456dd8-5038-4bcc-8f19-51325ac84c02\") " pod="openshift-infra/auto-csr-approver-29551688-9zsf6" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.498156 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-8lrmb" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.498541 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glwk4\" (UniqueName: \"kubernetes.io/projected/eb21f66e-5c18-49bb-8146-8185434e7c2f-kube-api-access-glwk4\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.498611 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/40e41a4c-dba3-4862-9f06-c59c538785be-certs\") pod \"machine-config-server-47fkz\" (UID: \"40e41a4c-dba3-4862-9f06-c59c538785be\") " pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.499400 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22c138b8-4431-4695-be3f-0ea008d21f30-cert\") pod \"ingress-canary-5hvbc\" (UID: \"22c138b8-4431-4695-be3f-0ea008d21f30\") " pod="openshift-ingress-canary/ingress-canary-5hvbc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.499794 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/40e41a4c-dba3-4862-9f06-c59c538785be-node-bootstrap-token\") pod \"machine-config-server-47fkz\" (UID: \"40e41a4c-dba3-4862-9f06-c59c538785be\") " pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.500542 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.502739 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.514443 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8fxv\" (UniqueName: \"kubernetes.io/projected/e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4-kube-api-access-q8fxv\") pod \"kube-storage-version-migrator-operator-b67b599dd-jkd8t\" (UID: \"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.535828 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsk4r\" (UniqueName: \"kubernetes.io/projected/964eb9a7-a580-44d4-b5e5-fe84d085823c-kube-api-access-tsk4r\") pod \"dns-operator-744455d44c-6sx4w\" (UID: \"964eb9a7-a580-44d4-b5e5-fe84d085823c\") " pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.570383 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-bound-sa-token\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.592906 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:52 crc kubenswrapper[4994]: E0310 00:09:52.593540 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:53.093514074 +0000 UTC m=+207.267220823 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.595705 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.596040 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spglq\" (UniqueName: \"kubernetes.io/projected/10eead56-2e9b-4d48-ab81-d1638b3cdddc-kube-api-access-spglq\") pod \"multus-admission-controller-857f4d67dd-vh5ns\" (UID: \"10eead56-2e9b-4d48-ab81-d1638b3cdddc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.608482 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r68q\" (UniqueName: \"kubernetes.io/projected/1e80a388-91b3-42f1-9ee2-70ab4850652d-kube-api-access-9r68q\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.622457 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.628470 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xm7j6"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.629550 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.631539 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.640305 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" event={"ID":"5937dfbb-0da7-439c-94cb-e0e1f658d464","Type":"ContainerStarted","Data":"8831fb3579ee9cdb923ec4d8c1aef2294468f33bab907bd5867001319278e8ad"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.642801 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" event={"ID":"1109e060-ef32-407d-8283-eba65e1d4eaa","Type":"ContainerStarted","Data":"c0c0f64e7f4625d4eaa33d78bcd1621a09cc39a5950f18848d396639c17eeac8"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.642903 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" event={"ID":"1109e060-ef32-407d-8283-eba65e1d4eaa","Type":"ContainerStarted","Data":"cd6bc53b06185452ec5ffc30b9345a8b0e9c6035b2724964bd1af174b6193e59"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.644335 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" 
event={"ID":"06c3fca2-64b6-47e2-885f-948eac331c10","Type":"ContainerStarted","Data":"73f546835be1acbcf5b5076338b328854de718376e12b8334ef017808360e5a8"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.644983 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkgzm\" (UniqueName: \"kubernetes.io/projected/a2bd9787-6df4-492a-8cab-18201a143385-kube-api-access-wkgzm\") pod \"service-ca-operator-777779d784-nf5dh\" (UID: \"a2bd9787-6df4-492a-8cab-18201a143385\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.645651 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-529vv\" (UniqueName: \"kubernetes.io/projected/5448e53f-3b74-47f1-9b28-705f36fd6ea3-kube-api-access-529vv\") pod \"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.645755 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" event={"ID":"54ca6ee4-24c4-415f-a1b6-26f54e2992f8","Type":"ContainerStarted","Data":"68e7ebb7b9a0fa967b84de70c209836439efd368a03ea8c0304dd46c8d9878be"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.651061 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.687269 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" event={"ID":"3a5ced5c-b690-4a1c-8d48-bbf789366816","Type":"ContainerStarted","Data":"a4f316c89cad6047ed47de9ae8a2fe8006cd6a1653d6e93a18547fa5d8c4c263"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.687920 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.694857 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbrkc\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-kube-api-access-kbrkc\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.695605 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: E0310 00:09:52.696049 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:53.19603645 +0000 UTC m=+207.369743199 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.701685 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28b5630a-9f96-453c-ac88-70d75b7d438d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5t2rc\" (UID: \"28b5630a-9f96-453c-ac88-70d75b7d438d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.709647 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngwjf\" (UniqueName: \"kubernetes.io/projected/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-kube-api-access-ngwjf\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.715138 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551688-9zsf6" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.722810 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29551680-sz8pz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.734805 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.739217 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" event={"ID":"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21","Type":"ContainerStarted","Data":"47cdd0fc5df08b42eefeda9e713ce29bf75c312c7780a55dff7b457658f9ecb2"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.739657 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1e80a388-91b3-42f1-9ee2-70ab4850652d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.742410 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.748564 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" event={"ID":"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339","Type":"ContainerStarted","Data":"63fb8a0ac5bfc9c8de9bc34e4233fe302c0bdff0000c0a3baace5279815afe2c"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.754330 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pwpc6" event={"ID":"6ed61f01-8d13-4883-ac58-0e998df5c20d","Type":"ContainerStarted","Data":"93848edcd94d1b8f74a090519d280473073418447e4977799669a8e2feb77dcb"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.754394 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pwpc6" 
event={"ID":"6ed61f01-8d13-4883-ac58-0e998df5c20d","Type":"ContainerStarted","Data":"41d7b08bd6160997086ea083f385225cccd4bb088f59fd6fb1f00cf222b305b5"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.754976 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.755434 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z95f9\" (UniqueName: \"kubernetes.io/projected/0ad3539f-9691-4344-9c7f-1b015c5e3b3d-kube-api-access-z95f9\") pod \"service-ca-9c57cc56f-pvfj5\" (UID: \"0ad3539f-9691-4344-9c7f-1b015c5e3b3d\") " pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.755569 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tgf68"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.757265 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" event={"ID":"903778b5-0c60-42d6-8773-a1345817fe1f","Type":"ContainerStarted","Data":"8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.757305 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" event={"ID":"903778b5-0c60-42d6-8773-a1345817fe1f","Type":"ContainerStarted","Data":"579d8e47d1cae1e88f269db0e29bcd43ee29c56b451d1f988d01fa0b8de660ec"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.757782 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.759931 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" 
event={"ID":"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d","Type":"ContainerStarted","Data":"f8a9fca1baf9c26e7d4e01430c74bc08dfce2dc2a5bf912289541ecfef313f55"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.772420 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-966nr"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.775587 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tkcb\" (UniqueName: \"kubernetes.io/projected/7fd7640d-700a-420e-b15f-7f681090727b-kube-api-access-7tkcb\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.779460 4994 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-fxpkq container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" start-of-body= Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.779768 4994 patch_prober.go:28] interesting pod/console-operator-58897d9998-pwpc6 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.779801 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-pwpc6" podUID="6ed61f01-8d13-4883-ac58-0e998df5c20d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.781696 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.779527 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" podUID="903778b5-0c60-42d6-8773-a1345817fe1f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.787186 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.799609 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:52 crc kubenswrapper[4994]: E0310 00:09:52.799963 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:53.299948031 +0000 UTC m=+207.473654780 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.802012 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rlqtz"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.806105 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84x8v\" (UniqueName: \"kubernetes.io/projected/22c138b8-4431-4695-be3f-0ea008d21f30-kube-api-access-84x8v\") pod \"ingress-canary-5hvbc\" (UID: \"22c138b8-4431-4695-be3f-0ea008d21f30\") " pod="openshift-ingress-canary/ingress-canary-5hvbc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.818110 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fzzb\" (UniqueName: \"kubernetes.io/projected/40e41a4c-dba3-4862-9f06-c59c538785be-kube-api-access-6fzzb\") pod \"machine-config-server-47fkz\" (UID: \"40e41a4c-dba3-4862-9f06-c59c538785be\") " pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.850072 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.885515 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.908836 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: W0310 00:09:52.911775 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb85bbdaa_daa8_4c69_abf9_9f1200eb07cd.slice/crio-18370ebb096ba95df384c4822d63f9eeb86d553280650a21652dc981b373eee5 WatchSource:0}: Error finding container 18370ebb096ba95df384c4822d63f9eeb86d553280650a21652dc981b373eee5: Status 404 returned error can't find the container with id 18370ebb096ba95df384c4822d63f9eeb86d553280650a21652dc981b373eee5 Mar 10 00:09:52 crc kubenswrapper[4994]: E0310 00:09:52.912470 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:53.412444519 +0000 UTC m=+207.586151268 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.940603 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:52 crc kubenswrapper[4994]: W0310 00:09:52.959192 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11b78073_cc4a_4a6f_89ab_631fde4b3371.slice/crio-463e9f6c752ae1494ce870d0e395342f9e1c8c71349ad766cac312fc6c1a6a3f WatchSource:0}: Error finding container 463e9f6c752ae1494ce870d0e395342f9e1c8c71349ad766cac312fc6c1a6a3f: Status 404 returned error can't find the container with id 463e9f6c752ae1494ce870d0e395342f9e1c8c71349ad766cac312fc6c1a6a3f Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.011739 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.011888 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:53.511842287 +0000 UTC m=+207.685549036 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.012314 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.012818 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:53.512799541 +0000 UTC m=+207.686506290 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.071571 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5hvbc" Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.105698 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.112896 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.144295 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:53.644260093 +0000 UTC m=+207.817966842 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.190349 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wv4d4"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.227055 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.227328 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:53.727315832 +0000 UTC m=+207.901022581 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.249926 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.315073 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.316413 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-725jp"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.322274 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.327958 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.328350 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:09:53.828325771 +0000 UTC m=+208.002032520 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.331731 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.336390 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.336487 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.405728 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8lrmb"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.429709 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.430999 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-10 00:09:53.930985302 +0000 UTC m=+208.104692051 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.437696 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-pwpc6" podStartSLOduration=162.437671269 podStartE2EDuration="2m42.437671269s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:53.433485645 +0000 UTC m=+207.607192414" watchObservedRunningTime="2026-03-10 00:09:53.437671269 +0000 UTC m=+207.611378018" Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.447640 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hqlnc"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.456377 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.472156 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6sx4w"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.474430 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vh5ns"] Mar 10 00:09:53 crc kubenswrapper[4994]: W0310 00:09:53.491590 4994 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod018c45cc_8cfa_497b_b6cf_25b10c694c58.slice/crio-793a5f9e3411b0af92f6a8ea8f03e0071d26aec82b46778cf3064a7e5ee1f344 WatchSource:0}: Error finding container 793a5f9e3411b0af92f6a8ea8f03e0071d26aec82b46778cf3064a7e5ee1f344: Status 404 returned error can't find the container with id 793a5f9e3411b0af92f6a8ea8f03e0071d26aec82b46778cf3064a7e5ee1f344 Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.531499 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.537093 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.037050988 +0000 UTC m=+208.210757787 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.635152 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.639289 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.139274587 +0000 UTC m=+208.312981336 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.639511 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.642883 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.644720 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.696220 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551688-9zsf6"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.708736 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.736936 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.737079 4994 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.237058635 +0000 UTC m=+208.410765384 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.737735 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.738249 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.238236015 +0000 UTC m=+208.411942764 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.775907 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" event={"ID":"3a5ced5c-b690-4a1c-8d48-bbf789366816","Type":"ContainerStarted","Data":"0a60270474c65ca8a943a325db1195f8fb6a2c9bbb7e7276d1716f28a656e96e"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.780354 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" event={"ID":"964eb9a7-a580-44d4-b5e5-fe84d085823c","Type":"ContainerStarted","Data":"684c79ab646de835ef99f0fc4edc687ed78690b215f63c5c50ab9f33c1f56326"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.787053 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" event={"ID":"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5","Type":"ContainerStarted","Data":"a0cece20202ff106bc018ae86d1363c0e076439cd4076737afc7c948b110c656"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.795302 4994 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.802144 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cf2xx"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.803265 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" event={"ID":"54ca6ee4-24c4-415f-a1b6-26f54e2992f8","Type":"ContainerStarted","Data":"0f1267926bcca137db3abcb72a6c709ffad1b8249211d418fa61e8ee79ffda76"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.803780 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.804295 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29551680-sz8pz"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.804600 4994 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-bkhqb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.804637 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" podUID="54ca6ee4-24c4-415f-a1b6-26f54e2992f8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.806123 4994 generic.go:334] "Generic (PLEG): container finished" podID="ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21" containerID="130fbf5ea61bd1e2ba5dc7dc75b09124c094f2fbf967022322ff72e08f29e934" exitCode=0 Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.806238 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" event={"ID":"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21","Type":"ContainerDied","Data":"130fbf5ea61bd1e2ba5dc7dc75b09124c094f2fbf967022322ff72e08f29e934"} 
Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.807247 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5hvbc"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.808710 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" event={"ID":"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8","Type":"ContainerStarted","Data":"e29a59b07d62557e79f131725545f7bbc14a1ca6dcc0ac4661855d156c889001"} Mar 10 00:09:53 crc kubenswrapper[4994]: W0310 00:09:53.809965 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5448e53f_3b74_47f1_9b28_705f36fd6ea3.slice/crio-4604c46a6249354e89435fdced2bee20c5ff8f5966359db2287131c85a900ae2 WatchSource:0}: Error finding container 4604c46a6249354e89435fdced2bee20c5ff8f5966359db2287131c85a900ae2: Status 404 returned error can't find the container with id 4604c46a6249354e89435fdced2bee20c5ff8f5966359db2287131c85a900ae2 Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.811616 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j" event={"ID":"f15954a6-2036-4c32-a8b6-bc8e227d0fcd","Type":"ContainerStarted","Data":"862b0d860a139426a426969abefaaa8beec085d72eb016f7646e2a789f671e13"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.815469 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wv4d4" event={"ID":"1377f73a-df08-4450-afa1-960e15891141","Type":"ContainerStarted","Data":"f7e9ccdb2334bb8c1236d21a555661e8771e2ed311721152323ab9263229c9ad"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.819083 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" 
event={"ID":"10eead56-2e9b-4d48-ab81-d1638b3cdddc","Type":"ContainerStarted","Data":"34345e98c1e2e3aeb5d04ca6093bf31fe453de1f2bdaa615125c36c47cfd63b8"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.839099 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-47fkz" event={"ID":"40e41a4c-dba3-4862-9f06-c59c538785be","Type":"ContainerStarted","Data":"c731c52cc501053a9e6cb744a378eb523999983be44bf73047633cb630bcde4f"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.839410 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.839582 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.339555822 +0000 UTC m=+208.513262581 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.839986 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.840483 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.340470785 +0000 UTC m=+208.514177534 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.842339 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" event={"ID":"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a","Type":"ContainerStarted","Data":"576032bc27c1dd3be1fef9c613168d79e3e52a5fcf4088fa1693aa437e4e506b"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.846448 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" event={"ID":"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd","Type":"ContainerStarted","Data":"18370ebb096ba95df384c4822d63f9eeb86d553280650a21652dc981b373eee5"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.851566 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" podStartSLOduration=162.851545002 podStartE2EDuration="2m42.851545002s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:53.844644019 +0000 UTC m=+208.018350768" watchObservedRunningTime="2026-03-10 00:09:53.851545002 +0000 UTC m=+208.025251741" Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.851864 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" 
event={"ID":"49f58ba5-3573-4894-a320-fcf4ca4e50f1","Type":"ContainerStarted","Data":"7e18e94ab3cc40d64e97f54b8dd1dbfa47c33d18a0cfd95fd87ea04af0af538b"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.853410 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" event={"ID":"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d","Type":"ContainerStarted","Data":"562b4f6ad03fbfc61eb6934ec8bb885e8e0d48326f7f4f305e46bd49329e3596"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.854262 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5" event={"ID":"3ffed56d-e2ab-4fa9-9dac-98c382395f2f","Type":"ContainerStarted","Data":"884c06f0e4bbf6834679dd6a2a57beae9594323d06c650e327564e27b620f9af"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.854917 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" event={"ID":"018c45cc-8cfa-497b-b6cf-25b10c694c58","Type":"ContainerStarted","Data":"793a5f9e3411b0af92f6a8ea8f03e0071d26aec82b46778cf3064a7e5ee1f344"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.864193 4994 generic.go:334] "Generic (PLEG): container finished" podID="5937dfbb-0da7-439c-94cb-e0e1f658d464" containerID="2c8a718072fce38c319dd3e089acc3bde7592d06567f719b0ced5363b1177e38" exitCode=0 Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.864452 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" event={"ID":"5937dfbb-0da7-439c-94cb-e0e1f658d464","Type":"ContainerDied","Data":"2c8a718072fce38c319dd3e089acc3bde7592d06567f719b0ced5363b1177e38"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.866520 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" 
event={"ID":"28b5630a-9f96-453c-ac88-70d75b7d438d","Type":"ContainerStarted","Data":"09662085f4bd1674515195af789aa910121956af3b6df1fb85ec192328d054d3"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.876287 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh" event={"ID":"4dbc1f03-d386-460b-81f5-e6b7d3630557","Type":"ContainerStarted","Data":"82f218da8fc6d991a5cd49e6b2a4ec167490c7cb89f8feb1f5d6a556f3224d1e"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.879811 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rlqtz" event={"ID":"11b78073-cc4a-4a6f-89ab-631fde4b3371","Type":"ContainerStarted","Data":"463e9f6c752ae1494ce870d0e395342f9e1c8c71349ad766cac312fc6c1a6a3f"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.882587 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" event={"ID":"0f2b43c0-a96d-4ea3-8d46-d6919aedf741","Type":"ContainerStarted","Data":"bae537ffa0b0b80c06d0b79407ab1e4733786fdf150560f197e315922a963b90"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.884185 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" event={"ID":"eb21f66e-5c18-49bb-8146-8185434e7c2f","Type":"ContainerStarted","Data":"6b16a5a1e886c45ca9e097a5676bc409e21a4ec9df85eba50d00f6e9744258e0"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.887444 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" event={"ID":"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d","Type":"ContainerStarted","Data":"59250ab52cbad73db0f4a15a56196056869f59efdeee140226e8cd658de70523"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.894355 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.916055 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" event={"ID":"06c3fca2-64b6-47e2-885f-948eac331c10","Type":"ContainerStarted","Data":"9d642500e4292360c763205ff2873028c27555cd29d6e96c3e3f459f6c03e625"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.919499 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf" event={"ID":"29f559d0-b505-4855-91e3-e46804b0c9f1","Type":"ContainerStarted","Data":"9f81f43504d157e1822651c9895f8bb4d4e77aa9ab1f5a0a09943aa22704c4ad"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.922820 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" event={"ID":"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e","Type":"ContainerStarted","Data":"b4afdb10540d5ed0856b1c89767c0d9e08ce43a42e7723cf69f249b417909f11"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.926605 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-x6s5d" event={"ID":"f9b1c3de-e5a3-467f-929b-afb8687fb7f0","Type":"ContainerStarted","Data":"7dd32d215b043f1e255d3b0e44386da08264c8468e9857a042c5c799844f7ee2"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.929604 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8lrmb" event={"ID":"4fb67636-fcba-4975-a460-403cd6ee9c25","Type":"ContainerStarted","Data":"36382d3e5b130d311923877c1260293bc964d90ead9bdf7dbab11f1738111946"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.938716 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" 
event={"ID":"9a1c67e3-f6df-4b4d-b3a3-669503580446","Type":"ContainerStarted","Data":"4fad456af3276012ecddb69fdf5089880e2b26cd92986f3e857c11107a4952b1"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.940960 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.941409 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.44133772 +0000 UTC m=+208.615044469 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.941734 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.942837 4994 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.442820887 +0000 UTC m=+208.616527646 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.960094 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-pvfj5"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.963198 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.972973 4994 generic.go:334] "Generic (PLEG): container finished" podID="4c9cbda0-655c-4cf9-8f9a-23b3ebf37339" containerID="8e871a87e6dfc98301313edad560135c0d1c2f67516af3aa4d21ffc05dcf1c75" exitCode=0 Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.973696 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" event={"ID":"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339","Type":"ContainerDied","Data":"8e871a87e6dfc98301313edad560135c0d1c2f67516af3aa4d21ffc05dcf1c75"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.976518 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" 
event={"ID":"2204937d-9632-46e6-8f26-0cea8593d1a5","Type":"ContainerStarted","Data":"9819e9aedbb8ff5e71ef5f8d74f93a1e29204f606152756297a69592fbda127a"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.977072 4994 patch_prober.go:28] interesting pod/console-operator-58897d9998-pwpc6 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.977118 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-pwpc6" podUID="6ed61f01-8d13-4883-ac58-0e998df5c20d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.977155 4994 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-fxpkq container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" start-of-body= Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.977190 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" podUID="903778b5-0c60-42d6-8773-a1345817fe1f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" Mar 10 00:09:53 crc kubenswrapper[4994]: W0310 00:09:53.997070 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ad3539f_9691_4344_9c7f_1b015c5e3b3d.slice/crio-c29635a43a370ddcef70193ceb4b233fcaa550295f8e461c5e67182d4585854b WatchSource:0}: Error finding container 
c29635a43a370ddcef70193ceb4b233fcaa550295f8e461c5e67182d4585854b: Status 404 returned error can't find the container with id c29635a43a370ddcef70193ceb4b233fcaa550295f8e461c5e67182d4585854b Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.042661 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.042906 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.542854262 +0000 UTC m=+208.716561051 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.043283 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.043708 4994 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.543697773 +0000 UTC m=+208.717404522 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.144414 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.146200 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.646174749 +0000 UTC m=+208.819881558 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.246057 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.246327 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.746317397 +0000 UTC m=+208.920024146 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.346638 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.347019 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.846998837 +0000 UTC m=+209.020705596 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.347348 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.347726 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.847715475 +0000 UTC m=+209.021422224 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.447076 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" podStartSLOduration=163.447055872 podStartE2EDuration="2m43.447055872s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:54.443192296 +0000 UTC m=+208.616899045" watchObservedRunningTime="2026-03-10 00:09:54.447055872 +0000 UTC m=+208.620762621" Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.448273 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.451113 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.951091094 +0000 UTC m=+209.124797843 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.553715 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.554298 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.054269017 +0000 UTC m=+209.227975756 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.556159 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" podStartSLOduration=162.556136193 podStartE2EDuration="2m42.556136193s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:54.552309558 +0000 UTC m=+208.726016307" watchObservedRunningTime="2026-03-10 00:09:54.556136193 +0000 UTC m=+208.729842942" Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.656329 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.656516 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.156490756 +0000 UTC m=+209.330197505 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.657642 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.657986 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.157977734 +0000 UTC m=+209.331684483 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.761381 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.761463 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.261447114 +0000 UTC m=+209.435153863 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.761658 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.762219 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.262206743 +0000 UTC m=+209.435913492 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.862448 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.862580 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.362558826 +0000 UTC m=+209.536265595 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.863222 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.864129 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.363813107 +0000 UTC m=+209.537519856 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.964080 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.964528 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.464510418 +0000 UTC m=+209.638217167 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.983627 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rlqtz" event={"ID":"11b78073-cc4a-4a6f-89ab-631fde4b3371","Type":"ContainerStarted","Data":"fdab6275ec494362430269075beb94b8c702e2909589a9fa3cd20488ebec44d8"} Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.987613 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-x6s5d" event={"ID":"f9b1c3de-e5a3-467f-929b-afb8687fb7f0","Type":"ContainerStarted","Data":"c90a584826553fb02bd29517277e0087eee6110714e60770bda7cb8fa9371309"} Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.990103 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" event={"ID":"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5","Type":"ContainerStarted","Data":"2fd88818eb9689cd277d2283b6033829ffd881934c4abd7ae705c1ecc7e0971a"} Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.990620 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.993132 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" event={"ID":"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e","Type":"ContainerStarted","Data":"b7d0030c69ad94cc9b98f04701b2bb8ba0e5f4ea4b650053785878dc81668b7a"} Mar 10 00:09:54 
crc kubenswrapper[4994]: I0310 00:09:54.993163 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.994295 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-47fkz" event={"ID":"40e41a4c-dba3-4862-9f06-c59c538785be","Type":"ContainerStarted","Data":"8028ad0ad8a4d65aab0a9577945312f04258071094f29460c386e8be32477640"} Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.995605 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551688-9zsf6" event={"ID":"a1456dd8-5038-4bcc-8f19-51325ac84c02","Type":"ContainerStarted","Data":"5834bb42526a131e58f316ca58bdd93f998331f2c74c80c7a568c0b2a5d292c8"} Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.997090 4994 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-d8g97 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:5443/healthz\": dial tcp 10.217.0.17:5443: connect: connection refused" start-of-body= Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.997130 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" podUID="70d3076a-1af2-4aed-93ac-8dbbebd7e7d5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.17:5443/healthz\": dial tcp 10.217.0.17:5443: connect: connection refused" Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.998072 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5" event={"ID":"3ffed56d-e2ab-4fa9-9dac-98c382395f2f","Type":"ContainerStarted","Data":"d7362fbe2cd97e9688561fa1ac7012eeb3d374e2fe9d4405c7e37ab649db642c"} Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 
00:09:54.998051 4994 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-c4cdv container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" start-of-body= Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.998117 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" podUID="5cc67063-d02f-4cb9-a15d-0d0a5c457e6e" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.001718 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29551680-sz8pz" event={"ID":"0779a70e-ebf5-4e98-87ea-43017b8d1e46","Type":"ContainerStarted","Data":"b48d5517182b7a9629abeaefea2ce0d25137af9372c04a623cb57c0cb0fada84"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.005509 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" event={"ID":"964eb9a7-a580-44d4-b5e5-fe84d085823c","Type":"ContainerStarted","Data":"9dbd79c8576252b697bc5e7472d200f7ca1e77cfe0295522d819a7399fa940ea"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.005831 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-rlqtz" podStartSLOduration=164.005819153 podStartE2EDuration="2m44.005819153s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.000362376 +0000 UTC m=+209.174069135" watchObservedRunningTime="2026-03-10 00:09:55.005819153 +0000 UTC m=+209.179525902" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 
00:09:55.007410 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" event={"ID":"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a","Type":"ContainerStarted","Data":"20a0c91236d538b3fa1745ca92b3a27af099486933c6d89f56542107be20f542"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.008634 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" event={"ID":"10eead56-2e9b-4d48-ab81-d1638b3cdddc","Type":"ContainerStarted","Data":"2462c46e1785c092405d62542e8d0b00b229bb8e630d5e0207a2f5dcf487a459"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.009543 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" event={"ID":"9a1c67e3-f6df-4b4d-b3a3-669503580446","Type":"ContainerStarted","Data":"03ebf9cde09f85323fd76e933846dfc1dfd8ba1c198723c3174424b6bd6a1144"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.010616 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf" event={"ID":"29f559d0-b505-4855-91e3-e46804b0c9f1","Type":"ContainerStarted","Data":"1fda82ed2dd75e4f1a15333c3f84108c6f138545e24043e618d968852d6e4eff"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.012369 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" event={"ID":"2204937d-9632-46e6-8f26-0cea8593d1a5","Type":"ContainerStarted","Data":"1560d54b45e99a47fa9666b7502cda6fe98263942b080cacdf734ec5737064c9"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.014094 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" 
event={"ID":"eb21f66e-5c18-49bb-8146-8185434e7c2f","Type":"ContainerStarted","Data":"6a43d5b7a3bb741ebbd18604c53964ec33e95cda9a8f13a07191480dc1f1e5d5"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.015527 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" event={"ID":"018c45cc-8cfa-497b-b6cf-25b10c694c58","Type":"ContainerStarted","Data":"d50670bd2c632b0f5afa80cc22121a3bdfc8020046cf4990c915ae61736f1197"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.019216 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" event={"ID":"0ad3539f-9691-4344-9c7f-1b015c5e3b3d","Type":"ContainerStarted","Data":"dcaefeae6fdcccd19f234f83b7d51bb8fd068722145409530bbf2ffa137202bd"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.019261 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" event={"ID":"0ad3539f-9691-4344-9c7f-1b015c5e3b3d","Type":"ContainerStarted","Data":"c29635a43a370ddcef70193ceb4b233fcaa550295f8e461c5e67182d4585854b"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.024926 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" podStartSLOduration=163.024906391 podStartE2EDuration="2m43.024906391s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.015808122 +0000 UTC m=+209.189514891" watchObservedRunningTime="2026-03-10 00:09:55.024906391 +0000 UTC m=+209.198613140" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.025235 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5hvbc" 
event={"ID":"22c138b8-4431-4695-be3f-0ea008d21f30","Type":"ContainerStarted","Data":"42057946215ffa6c798c8b0b4ea49811de3193e1f9b262beec1c0d1cc4a4a037"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.025296 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5hvbc" event={"ID":"22c138b8-4431-4695-be3f-0ea008d21f30","Type":"ContainerStarted","Data":"05e3932997f8b46fbf348ee0934f66ab6c46ae09a1f9db2c21f99ea679745e81"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.028578 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j" event={"ID":"f15954a6-2036-4c32-a8b6-bc8e227d0fcd","Type":"ContainerStarted","Data":"34d25c6a2ee4eeee418c3880eacd2750fa818e7f6c4a58ca8767f5904cb57411"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.035540 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" podStartSLOduration=163.035522506 podStartE2EDuration="2m43.035522506s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.033918726 +0000 UTC m=+209.207625485" watchObservedRunningTime="2026-03-10 00:09:55.035522506 +0000 UTC m=+209.209229255" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.037609 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" event={"ID":"0f2b43c0-a96d-4ea3-8d46-d6919aedf741","Type":"ContainerStarted","Data":"f1e1d34ce7a507537fd76dd423d8de4db96972c855db85d1fc738caf85d29e98"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.040007 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" 
event={"ID":"5448e53f-3b74-47f1-9b28-705f36fd6ea3","Type":"ContainerStarted","Data":"24c1fefc2882e1802747b7b3ebf67c3834e12e98460d60417f7a36bc60de9cb1"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.040083 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" event={"ID":"5448e53f-3b74-47f1-9b28-705f36fd6ea3","Type":"ContainerStarted","Data":"4604c46a6249354e89435fdced2bee20c5ff8f5966359db2287131c85a900ae2"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.041044 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" event={"ID":"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4","Type":"ContainerStarted","Data":"4f78110283db462476124633cbb5e70ba67655950f51a76567b14aad17ab4e79"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.043969 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" event={"ID":"1109e060-ef32-407d-8283-eba65e1d4eaa","Type":"ContainerStarted","Data":"24f1352c9336dd3ec53fc571a9de8e74008c187b3edad6056ea5e61514891ec0"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.047530 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" event={"ID":"a2bd9787-6df4-492a-8cab-18201a143385","Type":"ContainerStarted","Data":"892ce50aa95222df79ba6f81fa34311ba310427496cc4a27cb6a4523c8d16520"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.050943 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" event={"ID":"1e80a388-91b3-42f1-9ee2-70ab4850652d","Type":"ContainerStarted","Data":"daab938e9c2fed71fd9ddd4943fb96f2cc1ac97606c4f53965a269a0ac0033ef"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.052269 4994 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-x6s5d" podStartSLOduration=164.052233624 podStartE2EDuration="2m44.052233624s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.051540477 +0000 UTC m=+209.225247246" watchObservedRunningTime="2026-03-10 00:09:55.052233624 +0000 UTC m=+209.225940373" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.054459 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" event={"ID":"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d","Type":"ContainerStarted","Data":"f0af0f093c6ecb03be70c52a8245ae76e3b834b8e24a7ea4b7b1aa74d2c3d221"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.055746 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" event={"ID":"49f58ba5-3573-4894-a320-fcf4ca4e50f1","Type":"ContainerStarted","Data":"688973cae337fece4be2344fdc1fac73b3b7502d0d929e7d31d3ba244653b544"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.056011 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.057202 4994 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-bkq7b container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.057241 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" 
podUID="49f58ba5-3573-4894-a320-fcf4ca4e50f1" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.057611 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8lrmb" event={"ID":"4fb67636-fcba-4975-a460-403cd6ee9c25","Type":"ContainerStarted","Data":"131f6f968be699b4510e1711ff70f7d98fd24e9b749c0ac094982fa64eb070f5"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.058021 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8lrmb" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.060374 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.060409 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.065570 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.065936 4994 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.565925057 +0000 UTC m=+209.739631806 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.067558 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-47fkz" podStartSLOduration=6.067538328 podStartE2EDuration="6.067538328s" podCreationTimestamp="2026-03-10 00:09:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.064592584 +0000 UTC m=+209.238299343" watchObservedRunningTime="2026-03-10 00:09:55.067538328 +0000 UTC m=+209.241245077" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.070387 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wv4d4" event={"ID":"1377f73a-df08-4450-afa1-960e15891141","Type":"ContainerStarted","Data":"80b6d2bc1e285aeb1ca7ae7979928b3e2d6c3d1c1772bfc23e9a845a4b245bc5"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.071693 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" event={"ID":"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd","Type":"ContainerStarted","Data":"8746dfac88fe15ceee6b052e166cd28bfb74ffbe54cfe6f00bf09d8a12e889fd"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.073982 
4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.075579 4994 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tgf68 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.075622 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.076422 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh" event={"ID":"4dbc1f03-d386-460b-81f5-e6b7d3630557","Type":"ContainerStarted","Data":"5394779b6c263bbb8c6c5314221b41c0f79a1573e13a61c85573645f35a787c2"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.078320 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" event={"ID":"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d","Type":"ContainerStarted","Data":"c7f42fde54c7c039a01a7870000533ca9c1566739cc9d63a752c5d7902d1cb89"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.079102 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" event={"ID":"7fd7640d-700a-420e-b15f-7f681090727b","Type":"ContainerStarted","Data":"14861c8d1aac432a04ec6fccb527599ebf8d5d4d4e7d239ed4bcca6f9ca85ce6"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.080504 4994 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" event={"ID":"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339","Type":"ContainerStarted","Data":"36fc7c2c9305493a786c31a0378a6b14c114abc68b4a313dee8e717fec27dc9f"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.082004 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" event={"ID":"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8","Type":"ContainerStarted","Data":"ce0e0fe899887c51b11ad46c4b9de66da2a57ba53ab4c60cd515a10d07afe0f7"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.082522 4994 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-bkhqb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.082649 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" podUID="54ca6ee4-24c4-415f-a1b6-26f54e2992f8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.082733 4994 patch_prober.go:28] interesting pod/console-operator-58897d9998-pwpc6 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.082776 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-pwpc6" podUID="6ed61f01-8d13-4883-ac58-0e998df5c20d" 
containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.091176 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" podStartSLOduration=163.091152049 podStartE2EDuration="2m43.091152049s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.090272767 +0000 UTC m=+209.263979536" watchObservedRunningTime="2026-03-10 00:09:55.091152049 +0000 UTC m=+209.264858808" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.107215 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-8lrmb" podStartSLOduration=164.107195011 podStartE2EDuration="2m44.107195011s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.104731689 +0000 UTC m=+209.278438448" watchObservedRunningTime="2026-03-10 00:09:55.107195011 +0000 UTC m=+209.280901760" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.125337 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" podStartSLOduration=164.125312324 podStartE2EDuration="2m44.125312324s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.11752414 +0000 UTC m=+209.291230889" watchObservedRunningTime="2026-03-10 00:09:55.125312324 +0000 UTC m=+209.299019083" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.137465 4994 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf" podStartSLOduration=164.137449049 podStartE2EDuration="2m44.137449049s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.135380887 +0000 UTC m=+209.309087636" watchObservedRunningTime="2026-03-10 00:09:55.137449049 +0000 UTC m=+209.311155798" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.155169 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" podStartSLOduration=164.155152712 podStartE2EDuration="2m44.155152712s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.154188818 +0000 UTC m=+209.327895577" watchObservedRunningTime="2026-03-10 00:09:55.155152712 +0000 UTC m=+209.328859461" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.166519 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.166701 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.66667821 +0000 UTC m=+209.840384959 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.167009 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.168273 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.66825499 +0000 UTC m=+209.841961749 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.218855 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" podStartSLOduration=164.218827246 podStartE2EDuration="2m44.218827246s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.185314347 +0000 UTC m=+209.359021116" watchObservedRunningTime="2026-03-10 00:09:55.218827246 +0000 UTC m=+209.392534015" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.222215 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j" podStartSLOduration=164.22220271 podStartE2EDuration="2m44.22220271s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.20219835 +0000 UTC m=+209.375905129" watchObservedRunningTime="2026-03-10 00:09:55.22220271 +0000 UTC m=+209.395909480" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.231618 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" podStartSLOduration=164.231591765 podStartE2EDuration="2m44.231591765s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.224238472 +0000 UTC m=+209.397945271" watchObservedRunningTime="2026-03-10 00:09:55.231591765 +0000 UTC m=+209.405298514" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.242510 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh" podStartSLOduration=164.242479078 podStartE2EDuration="2m44.242479078s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.240464168 +0000 UTC m=+209.414170917" watchObservedRunningTime="2026-03-10 00:09:55.242479078 +0000 UTC m=+209.416185827" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.255216 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" podStartSLOduration=164.255199447 podStartE2EDuration="2m44.255199447s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.253038953 +0000 UTC m=+209.426745712" watchObservedRunningTime="2026-03-10 00:09:55.255199447 +0000 UTC m=+209.428906196" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.269205 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.269601 4994 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.769573577 +0000 UTC m=+209.943280386 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.270011 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.271400 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.771388492 +0000 UTC m=+209.945095321 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.276611 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" podStartSLOduration=164.276586672 podStartE2EDuration="2m44.276586672s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.268816278 +0000 UTC m=+209.442523027" watchObservedRunningTime="2026-03-10 00:09:55.276586672 +0000 UTC m=+209.450293421" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.294649 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" podStartSLOduration=163.294630954 podStartE2EDuration="2m43.294630954s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.293919997 +0000 UTC m=+209.467626796" watchObservedRunningTime="2026-03-10 00:09:55.294630954 +0000 UTC m=+209.468337703" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.371258 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.371601 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.87157519 +0000 UTC m=+210.045281959 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.372677 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.373518 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.873507199 +0000 UTC m=+210.047213948 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.473906 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.474748 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.974728333 +0000 UTC m=+210.148435082 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.576609 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.577139 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.077100407 +0000 UTC m=+210.250807206 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.679637 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.679935 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.17989469 +0000 UTC m=+210.353601439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.680703 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.681057 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.181045869 +0000 UTC m=+210.354752618 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.782753 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.783115 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.283090514 +0000 UTC m=+210.456797263 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.783351 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.783748 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.283739621 +0000 UTC m=+210.457446370 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.850693 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.853098 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.853189 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.885174 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.885748 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.385714894 +0000 UTC m=+210.559421643 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.886496 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.886998 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.386979025 +0000 UTC m=+210.560685774 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.987351 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.987491 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.487464681 +0000 UTC m=+210.661171460 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.988228 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.988858 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.488828576 +0000 UTC m=+210.662535495 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.089411 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.089559 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.589537167 +0000 UTC m=+210.763243916 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.089598 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.089992 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.589980108 +0000 UTC m=+210.763686857 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.091375 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wv4d4" event={"ID":"1377f73a-df08-4450-afa1-960e15891141","Type":"ContainerStarted","Data":"f965e4bdb02543038bc4a981e562cddd7e381bb2d24ba9a040b99b6e1e92416d"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.092113 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.094977 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5" event={"ID":"3ffed56d-e2ab-4fa9-9dac-98c382395f2f","Type":"ContainerStarted","Data":"00929d6fc9c599b907598b3546f56ee130547c1395ea2c4cb9be0fe6da4999a6"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.096955 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29551680-sz8pz" event={"ID":"0779a70e-ebf5-4e98-87ea-43017b8d1e46","Type":"ContainerStarted","Data":"a4cfa8b96c6aa5123624fb879c0f68820a0d96a764fb960d5b7561f433ae5dad"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.098366 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" event={"ID":"28b5630a-9f96-453c-ac88-70d75b7d438d","Type":"ContainerStarted","Data":"ca2bf1116193219cdd988fc6185e54ae625be4d81126fe0d0819e83f847bd653"} Mar 10 00:09:56 crc 
kubenswrapper[4994]: I0310 00:09:56.100787 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" event={"ID":"10eead56-2e9b-4d48-ab81-d1638b3cdddc","Type":"ContainerStarted","Data":"3e518552546dce36ddb039ca2542fbbaa1c3fe1c2b7ebbff223c196e516b986d"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.102893 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" event={"ID":"964eb9a7-a580-44d4-b5e5-fe84d085823c","Type":"ContainerStarted","Data":"4a6b7543dffaf97e57a78e1f116971619d8dd0a2bd6e5852c145f4ec4559ba38"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.104975 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" event={"ID":"5937dfbb-0da7-439c-94cb-e0e1f658d464","Type":"ContainerStarted","Data":"b2622b6e369b3bc1c122bb59b37130409b242aeb0a819b1f2e4fe178f09fd834"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.106210 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" event={"ID":"a2bd9787-6df4-492a-8cab-18201a143385","Type":"ContainerStarted","Data":"234cdec71c533bc815a74e577d193619fa75ef730b685db920e9eb073edb3fdb"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.108785 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wv4d4" podStartSLOduration=7.108768279 podStartE2EDuration="7.108768279s" podCreationTimestamp="2026-03-10 00:09:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.106803919 +0000 UTC m=+210.280510698" watchObservedRunningTime="2026-03-10 00:09:56.108768279 +0000 UTC m=+210.282475028" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.110328 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" event={"ID":"2204937d-9632-46e6-8f26-0cea8593d1a5","Type":"ContainerStarted","Data":"4d243ddd65462760dbee4f1651ddaa4809dfeb893cd7cdf23d37cbd6b811484f"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.111016 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.116072 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" event={"ID":"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21","Type":"ContainerStarted","Data":"b3e88a3c77c3dbe81c13b71d89325919a585d6d800ed9ee595d2dd5b462d8747"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.118703 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" event={"ID":"9a1c67e3-f6df-4b4d-b3a3-669503580446","Type":"ContainerStarted","Data":"399e02d23d05e73f7894f67ce3daf4adc1113c0ce6603248bb5fb54dd2e96ce0"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.123262 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5" podStartSLOduration=165.123230291 podStartE2EDuration="2m45.123230291s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.119469976 +0000 UTC m=+210.293176725" watchObservedRunningTime="2026-03-10 00:09:56.123230291 +0000 UTC m=+210.296937030" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.128744 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" 
event={"ID":"018c45cc-8cfa-497b-b6cf-25b10c694c58","Type":"ContainerStarted","Data":"97066324072ecc4daa4504d93999ce14cec65927cbefe353672ed493b9910920"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.132381 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" event={"ID":"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4","Type":"ContainerStarted","Data":"15bae0b264501d6183ac485bbda4701175856674d4622471a6212abc557f5d2a"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.135755 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" event={"ID":"1e80a388-91b3-42f1-9ee2-70ab4850652d","Type":"ContainerStarted","Data":"37b3d7d2c792686c78d21e85d376bbb0ce24db71e2ebde23bcca29f40317cd89"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.138307 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" event={"ID":"5448e53f-3b74-47f1-9b28-705f36fd6ea3","Type":"ContainerStarted","Data":"93d8e1d51b5bce53cefebc9c4a8bc10bc17991737eb9b549fabd5d8e6c562d61"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.143277 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" podStartSLOduration=164.143259372 podStartE2EDuration="2m44.143259372s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.139302422 +0000 UTC m=+210.313009171" watchObservedRunningTime="2026-03-10 00:09:56.143259372 +0000 UTC m=+210.316966131" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.145479 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" event={"ID":"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a","Type":"ContainerStarted","Data":"6e6a1a4eefe4c6e2cbbd5fc27ecdc34d61c889dd0df49be92721df7bed5fca11"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146083 4994 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tgf68 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146143 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146179 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146227 4994 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-bkq7b container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146237 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial 
tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146259 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" podUID="49f58ba5-3573-4894-a320-fcf4ca4e50f1" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146427 4994 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-c4cdv container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" start-of-body= Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146458 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" podUID="5cc67063-d02f-4cb9-a15d-0d0a5c457e6e" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146722 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146771 4994 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-d8g97 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:5443/healthz\": dial tcp 10.217.0.17:5443: connect: connection refused" start-of-body= Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146794 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" 
podUID="70d3076a-1af2-4aed-93ac-8dbbebd7e7d5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.17:5443/healthz\": dial tcp 10.217.0.17:5443: connect: connection refused" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.153134 4994 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xm7j6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.153183 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" podUID="51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.190472 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.192750 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.692728261 +0000 UTC m=+210.866435020 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.195091 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.195426 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.695410127 +0000 UTC m=+210.869116886 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.238653 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" podStartSLOduration=164.23863539 podStartE2EDuration="2m44.23863539s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.238110407 +0000 UTC m=+210.411817156" watchObservedRunningTime="2026-03-10 00:09:56.23863539 +0000 UTC m=+210.412342139"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.263771 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" podStartSLOduration=165.263757919 podStartE2EDuration="2m45.263757919s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.26219617 +0000 UTC m=+210.435902919" watchObservedRunningTime="2026-03-10 00:09:56.263757919 +0000 UTC m=+210.437464668"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.303932 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.304228 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29551680-sz8pz" podStartSLOduration=165.304213892 podStartE2EDuration="2m45.304213892s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.303701369 +0000 UTC m=+210.477408118" watchObservedRunningTime="2026-03-10 00:09:56.304213892 +0000 UTC m=+210.477920641"
Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.305260 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.805239528 +0000 UTC m=+210.978946277 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.325765 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" podStartSLOduration=164.325742431 podStartE2EDuration="2m44.325742431s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.323754891 +0000 UTC m=+210.497461650" watchObservedRunningTime="2026-03-10 00:09:56.325742431 +0000 UTC m=+210.499449180"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.349188 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" podStartSLOduration=165.349168847 podStartE2EDuration="2m45.349168847s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.348735127 +0000 UTC m=+210.522441886" watchObservedRunningTime="2026-03-10 00:09:56.349168847 +0000 UTC m=+210.522875596"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.370670 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" podStartSLOduration=165.370650075 podStartE2EDuration="2m45.370650075s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.36641091 +0000 UTC m=+210.540117669" watchObservedRunningTime="2026-03-10 00:09:56.370650075 +0000 UTC m=+210.544356874"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.405507 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" podStartSLOduration=164.405486087 podStartE2EDuration="2m44.405486087s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.392265157 +0000 UTC m=+210.565971916" watchObservedRunningTime="2026-03-10 00:09:56.405486087 +0000 UTC m=+210.579192836"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.407341 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" podStartSLOduration=165.407331394 podStartE2EDuration="2m45.407331394s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.404670117 +0000 UTC m=+210.578376866" watchObservedRunningTime="2026-03-10 00:09:56.407331394 +0000 UTC m=+210.581038143"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.409786 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.410180 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.910165915 +0000 UTC m=+211.083872664 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.424079 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" podStartSLOduration=164.424061123 podStartE2EDuration="2m44.424061123s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.420287909 +0000 UTC m=+210.593994658" watchObservedRunningTime="2026-03-10 00:09:56.424061123 +0000 UTC m=+210.597767872"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.443224 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" podStartSLOduration=164.443207262 podStartE2EDuration="2m44.443207262s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.440585827 +0000 UTC m=+210.614292596" watchObservedRunningTime="2026-03-10 00:09:56.443207262 +0000 UTC m=+210.616914011"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.457514 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5hvbc" podStartSLOduration=7.45749843 podStartE2EDuration="7.45749843s" podCreationTimestamp="2026-03-10 00:09:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.456472085 +0000 UTC m=+210.630178834" watchObservedRunningTime="2026-03-10 00:09:56.45749843 +0000 UTC m=+210.631205169"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.507716 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" podStartSLOduration=165.507698537 podStartE2EDuration="2m45.507698537s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.50583418 +0000 UTC m=+210.679540929" watchObservedRunningTime="2026-03-10 00:09:56.507698537 +0000 UTC m=+210.681405286"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.508915 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" podStartSLOduration=165.508908617 podStartE2EDuration="2m45.508908617s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.484829585 +0000 UTC m=+210.658536344" watchObservedRunningTime="2026-03-10 00:09:56.508908617 +0000 UTC m=+210.682615366"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.510239 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.510373 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.010358084 +0000 UTC m=+211.184064833 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.510470 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.510493 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.510534 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.510616 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.511287 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.011270766 +0000 UTC m=+211.184977515 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.522045 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.523108 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.538661 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.555601 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" podStartSLOduration=164.555584156 podStartE2EDuration="2m44.555584156s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.533489192 +0000 UTC m=+210.707195941" watchObservedRunningTime="2026-03-10 00:09:56.555584156 +0000 UTC m=+210.729290905"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.559091 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" podStartSLOduration=165.559084823 podStartE2EDuration="2m45.559084823s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.55375136 +0000 UTC m=+210.727458109" watchObservedRunningTime="2026-03-10 00:09:56.559084823 +0000 UTC m=+210.732791562"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.611494 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.611618 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.611676 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2"
Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.612014 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.111987538 +0000 UTC m=+211.285694287 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.616410 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.616620 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.691598 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.712270 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.712674 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.212658949 +0000 UTC m=+211.386365698 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.713643 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.726720 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.736683 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.787128 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.787830 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.791381 4994 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-xtfzl container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.12:8443/livez\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.791488 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" podUID="5937dfbb-0da7-439c-94cb-e0e1f658d464" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.12:8443/livez\": dial tcp 10.217.0.12:8443: connect: connection refused"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.812994 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.813197 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.313167725 +0000 UTC m=+211.486874474 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.813456 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.813794 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.31378344 +0000 UTC m=+211.487490189 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.853254 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.853356 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.914412 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.914600 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.414576244 +0000 UTC m=+211.588282993 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.914755 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.915148 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.415133078 +0000 UTC m=+211.588839827 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.965233 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vxjt2"]
Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.015560 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:09:57 crc kubenswrapper[4994]: E0310 00:09:57.015817 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.515800898 +0000 UTC m=+211.689507647 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:09:57 crc kubenswrapper[4994]: W0310 00:09:57.056158 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-8db223696e8ca4d3c62124625f1bdb16b634b5c0f256b76f7d08b7a96df3ed89 WatchSource:0}: Error finding container 8db223696e8ca4d3c62124625f1bdb16b634b5c0f256b76f7d08b7a96df3ed89: Status 404 returned error can't find the container with id 8db223696e8ca4d3c62124625f1bdb16b634b5c0f256b76f7d08b7a96df3ed89
Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.117926 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:09:57 crc kubenswrapper[4994]: E0310 00:09:57.118303 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.618288295 +0000 UTC m=+211.791995044 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.154741 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8db223696e8ca4d3c62124625f1bdb16b634b5c0f256b76f7d08b7a96df3ed89"}
Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.160477 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" event={"ID":"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21","Type":"ContainerStarted","Data":"1b58b06db75022f3b55ce331168196e97a6d8615bab038fd3308b681b7b3452c"}
Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.166701 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" event={"ID":"f4c125b3-4a9c-46a7-a468-54e93c44751d","Type":"ContainerStarted","Data":"1d5629d59e513cf0201f405d88fdc5c855901aca9fe25c7a180bf3d7ba725320"}
Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.168314 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.168355 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.172982 4994 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xm7j6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body=
Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.173007 4994 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tgf68 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body=
Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.173027 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" podUID="51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused"
Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.173056 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused"
Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.198359 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" podStartSLOduration=166.198339809 podStartE2EDuration="2m46.198339809s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:57.180888292 +0000 UTC m=+211.354595041" watchObservedRunningTime="2026-03-10 00:09:57.198339809 +0000 UTC m=+211.372046558"
Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.200059 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" podStartSLOduration=166.200055472 podStartE2EDuration="2m46.200055472s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:57.197330114 +0000 UTC m=+211.371036863" watchObservedRunningTime="2026-03-10 00:09:57.200055472 +0000 UTC m=+211.373762221"
Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.219266 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:09:57 crc kubenswrapper[4994]: E0310 00:09:57.219367 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.719350365 +0000 UTC m=+211.893057114 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.220739 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:09:57 crc kubenswrapper[4994]: E0310 00:09:57.222390 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.722380641 +0000 UTC m=+211.896087390 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.321549 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:57 crc kubenswrapper[4994]: E0310 00:09:57.321787 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.82176656 +0000 UTC m=+211.995473309 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:57 crc kubenswrapper[4994]: W0310 00:09:57.322605 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-cfaf89a2969de096994cddd0555263d037fee8a7bf136c9d71487a28ab94a863 WatchSource:0}: Error finding container cfaf89a2969de096994cddd0555263d037fee8a7bf136c9d71487a28ab94a863: Status 404 returned error can't find the container with id cfaf89a2969de096994cddd0555263d037fee8a7bf136c9d71487a28ab94a863 Mar 10 00:09:57 crc kubenswrapper[4994]: W0310 00:09:57.333040 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-8abc1f17c56089b934df9306ad8ce987a405b1b8f04741ab0124507546ed2e63 WatchSource:0}: Error finding container 8abc1f17c56089b934df9306ad8ce987a405b1b8f04741ab0124507546ed2e63: Status 404 returned error can't find the container with id 8abc1f17c56089b934df9306ad8ce987a405b1b8f04741ab0124507546ed2e63 Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.603506 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:57 crc 
kubenswrapper[4994]: E0310 00:09:57.604080 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.104058897 +0000 UTC m=+212.277765677 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.705224 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:57 crc kubenswrapper[4994]: E0310 00:09:57.705423 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.205390874 +0000 UTC m=+212.379097613 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.706099 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:57 crc kubenswrapper[4994]: E0310 00:09:57.706981 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.206961564 +0000 UTC m=+212.380668353 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.773097 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.775019 4994 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-mzg85 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.775094 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" podUID="4c9cbda0-655c-4cf9-8f9a-23b3ebf37339" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.775749 4994 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-mzg85 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.775803 4994 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" podUID="4c9cbda0-655c-4cf9-8f9a-23b3ebf37339" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.775857 4994 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-mzg85 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.775890 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" podUID="4c9cbda0-655c-4cf9-8f9a-23b3ebf37339" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.807336 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:57 crc kubenswrapper[4994]: E0310 00:09:57.807572 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.307531622 +0000 UTC m=+212.481238371 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.807716 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:57 crc kubenswrapper[4994]: E0310 00:09:57.808329 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.308310652 +0000 UTC m=+212.482017401 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.852700 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.852764 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.909724 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:57 crc kubenswrapper[4994]: E0310 00:09:57.910213 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.410189893 +0000 UTC m=+212.583896642 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.010475 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.011338 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.511313065 +0000 UTC m=+212.685019814 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.111475 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.111766 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.61174849 +0000 UTC m=+212.785455239 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.190672 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3f33b6c56cfac1967a4dcbddceca032158c67699506b06995e597ac4b64027e4"} Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.192755 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0f13ea4e720f3258d8b7a44cbb5496b8925efb6c962c67c6dccfa890d7fb97d6"} Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.192844 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"cfaf89a2969de096994cddd0555263d037fee8a7bf136c9d71487a28ab94a863"} Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.193830 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" event={"ID":"f4c125b3-4a9c-46a7-a468-54e93c44751d","Type":"ContainerStarted","Data":"683f739c9fae78029d3e208484563aaa46773767208f7226c53323f5c7fc2207"} Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.195241 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"eebe108915a33f04c0847174595c1bed992e476de8a176720fc247e2ae933044"} Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.195276 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8abc1f17c56089b934df9306ad8ce987a405b1b8f04741ab0124507546ed2e63"} Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.218538 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.219818 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.719801175 +0000 UTC m=+212.893508124 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.319465 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.321182 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.819714517 +0000 UTC m=+212.993421266 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.321269 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.321652 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.821630784 +0000 UTC m=+212.995337533 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.422354 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.422543 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.92251644 +0000 UTC m=+213.096223189 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.422601 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.422921 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.922909391 +0000 UTC m=+213.096616140 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.526169 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.526517 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.026472603 +0000 UTC m=+213.200179352 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.526693 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.527043 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.027027187 +0000 UTC m=+213.200733936 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.627651 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.628061 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.128000555 +0000 UTC m=+213.301707304 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.628213 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.628668 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.128657392 +0000 UTC m=+213.302364141 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.729170 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.729365 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.229333592 +0000 UTC m=+213.403040351 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.729446 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.729714 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.229702181 +0000 UTC m=+213.403408930 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.830666 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.830849 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.330819973 +0000 UTC m=+213.504526722 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.831025 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.831358 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.331346246 +0000 UTC m=+213.505052995 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.866525 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:09:58 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:09:58 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:09:58 crc kubenswrapper[4994]: healthz check failed Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.866599 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.932682 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.933067 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:09:59.433031463 +0000 UTC m=+213.606738212 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.034926 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.035327 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.535313283 +0000 UTC m=+213.709020032 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.130610 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59690: no serving certificate available for the kubelet" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.136214 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.136557 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.636542108 +0000 UTC m=+213.810248857 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.194817 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59702: no serving certificate available for the kubelet" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.200749 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" event={"ID":"f4c125b3-4a9c-46a7-a468-54e93c44751d","Type":"ContainerStarted","Data":"ea7e1f47de27d1f230c1087175c3b1b836fe0f70fdf5858eac2cae561d8c7863"} Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.220224 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59706: no serving certificate available for the kubelet" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.237647 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.238401 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.738388898 +0000 UTC m=+213.912095647 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.297528 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59714: no serving certificate available for the kubelet" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.321198 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.322011 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.333073 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.339146 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.339597 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3d610d6-85f4-43b2-a597-4955431daa70-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c3d610d6-85f4-43b2-a597-4955431daa70\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.339628 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.339659 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3d610d6-85f4-43b2-a597-4955431daa70-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c3d610d6-85f4-43b2-a597-4955431daa70\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.339860 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.839780557 +0000 UTC m=+214.013487316 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.350488 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.399142 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59730: no serving certificate available for the kubelet" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.416520 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59732: no serving certificate available for the kubelet" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.441040 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3d610d6-85f4-43b2-a597-4955431daa70-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c3d610d6-85f4-43b2-a597-4955431daa70\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.441077 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.441113 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/c3d610d6-85f4-43b2-a597-4955431daa70-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c3d610d6-85f4-43b2-a597-4955431daa70\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.441465 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3d610d6-85f4-43b2-a597-4955431daa70-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c3d610d6-85f4-43b2-a597-4955431daa70\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.441528 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.941511884 +0000 UTC m=+214.115218633 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.461780 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3d610d6-85f4-43b2-a597-4955431daa70-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c3d610d6-85f4-43b2-a597-4955431daa70\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.541901 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.542097 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.042077682 +0000 UTC m=+214.215784431 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.542290 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.542680 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.042669746 +0000 UTC m=+214.216376495 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.562108 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59734: no serving certificate available for the kubelet" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.643737 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.644205 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.144174379 +0000 UTC m=+214.317881168 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.650675 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.747961 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.748844 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.248833079 +0000 UTC m=+214.422539828 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.850071 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.850340 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.350304159 +0000 UTC m=+214.524010908 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.850399 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.850934 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.350916904 +0000 UTC m=+214.524623643 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.856537 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:09:59 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:09:59 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:09:59 crc kubenswrapper[4994]: healthz check failed Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.856611 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.858320 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 00:09:59 crc kubenswrapper[4994]: W0310 00:09:59.871077 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc3d610d6_85f4_43b2_a597_4955431daa70.slice/crio-31dddbc250d328a5341968e6887f3a959acf990551da79bebde9d9af83c538be WatchSource:0}: Error finding container 31dddbc250d328a5341968e6887f3a959acf990551da79bebde9d9af83c538be: Status 404 returned error can't find the container with id 31dddbc250d328a5341968e6887f3a959acf990551da79bebde9d9af83c538be Mar 10 00:09:59 crc 
kubenswrapper[4994]: I0310 00:09:59.940763 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59742: no serving certificate available for the kubelet" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.951917 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.952074 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.452048496 +0000 UTC m=+214.625755245 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.952178 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.952499 4994 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.452489518 +0000 UTC m=+214.626196267 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.052863 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.052976 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.552955003 +0000 UTC m=+214.726661752 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.053152 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.053471 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.553448856 +0000 UTC m=+214.727169325 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.129275 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551690-7rbl8"]
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.130268 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551690-7rbl8"
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.135609 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551690-7rbl8"]
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.154717 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.154921 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.654904226 +0000 UTC m=+214.828610975 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.154959 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97ch2\" (UniqueName: \"kubernetes.io/projected/f04aae5d-b067-4e49-82f3-66412ec1bba6-kube-api-access-97ch2\") pod \"auto-csr-approver-29551690-7rbl8\" (UID: \"f04aae5d-b067-4e49-82f3-66412ec1bba6\") " pod="openshift-infra/auto-csr-approver-29551690-7rbl8"
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.155016 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.155247 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.655241024 +0000 UTC m=+214.828947773 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.210606 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c3d610d6-85f4-43b2-a597-4955431daa70","Type":"ContainerStarted","Data":"31dddbc250d328a5341968e6887f3a959acf990551da79bebde9d9af83c538be"}
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.228195 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vxjt2" podStartSLOduration=169.22817835 podStartE2EDuration="2m49.22817835s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:10:00.226734615 +0000 UTC m=+214.400441364" watchObservedRunningTime="2026-03-10 00:10:00.22817835 +0000 UTC m=+214.401885099"
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.256338 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.256545 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97ch2\" (UniqueName: \"kubernetes.io/projected/f04aae5d-b067-4e49-82f3-66412ec1bba6-kube-api-access-97ch2\") pod \"auto-csr-approver-29551690-7rbl8\" (UID: \"f04aae5d-b067-4e49-82f3-66412ec1bba6\") " pod="openshift-infra/auto-csr-approver-29551690-7rbl8"
Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.257331 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.75731723 +0000 UTC m=+214.931023979 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.292295 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97ch2\" (UniqueName: \"kubernetes.io/projected/f04aae5d-b067-4e49-82f3-66412ec1bba6-kube-api-access-97ch2\") pod \"auto-csr-approver-29551690-7rbl8\" (UID: \"f04aae5d-b067-4e49-82f3-66412ec1bba6\") " pod="openshift-infra/auto-csr-approver-29551690-7rbl8"
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.358413 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.358921 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.858854613 +0000 UTC m=+215.032561362 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.442051 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551690-7rbl8"
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.459279 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.459410 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.95938125 +0000 UTC m=+215.133087999 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.459582 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.459819 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.95981241 +0000 UTC m=+215.133519159 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.562456 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.562998 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.062972614 +0000 UTC m=+215.236679393 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.629196 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59746: no serving certificate available for the kubelet"
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.664663 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.664953 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.164942656 +0000 UTC m=+215.338649405 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.680505 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551690-7rbl8"]
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.765517 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.765970 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.265950455 +0000 UTC m=+215.439657204 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.768536 4994 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-mzg85 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body=
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.768585 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" podUID="4c9cbda0-655c-4cf9-8f9a-23b3ebf37339" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused"
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.768627 4994 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-mzg85 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body=
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.768669 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" podUID="4c9cbda0-655c-4cf9-8f9a-23b3ebf37339" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused"
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.854024 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 00:10:00 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld
Mar 10 00:10:00 crc kubenswrapper[4994]: [+]process-running ok
Mar 10 00:10:00 crc kubenswrapper[4994]: healthz check failed
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.854098 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.867610 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.868073 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.368053391 +0000 UTC m=+215.541760140 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.968474 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.968684 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.46865375 +0000 UTC m=+215.642360509 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.968744 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.969117 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.469104132 +0000 UTC m=+215.642810881 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.973241 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.973843 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.976728 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.976954 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.984282 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.069821 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.070067 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2a178b03-e81c-47af-898a-0463f964e327-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2a178b03-e81c-47af-898a-0463f964e327\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.070179 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2a178b03-e81c-47af-898a-0463f964e327-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2a178b03-e81c-47af-898a-0463f964e327\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.070329 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.570310696 +0000 UTC m=+215.744017455 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.171180 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2a178b03-e81c-47af-898a-0463f964e327-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2a178b03-e81c-47af-898a-0463f964e327\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.171270 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2a178b03-e81c-47af-898a-0463f964e327-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2a178b03-e81c-47af-898a-0463f964e327\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.171296 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.171412 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2a178b03-e81c-47af-898a-0463f964e327-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2a178b03-e81c-47af-898a-0463f964e327\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.171553 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.671542311 +0000 UTC m=+215.845249060 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.217510 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" event={"ID":"7fd7640d-700a-420e-b15f-7f681090727b","Type":"ContainerStarted","Data":"e23e4a004f421713fe5580c3caf54896d2c54a646b5ca0dd5f920ecb04055cc5"}
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.220643 4994 generic.go:334] "Generic (PLEG): container finished" podID="0f2b43c0-a96d-4ea3-8d46-d6919aedf741" containerID="f1e1d34ce7a507537fd76dd423d8de4db96972c855db85d1fc738caf85d29e98" exitCode=0
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.220709 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" event={"ID":"0f2b43c0-a96d-4ea3-8d46-d6919aedf741","Type":"ContainerDied","Data":"f1e1d34ce7a507537fd76dd423d8de4db96972c855db85d1fc738caf85d29e98"}
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.222634 4994 generic.go:334] "Generic (PLEG): container finished" podID="c3d610d6-85f4-43b2-a597-4955431daa70" containerID="c633aedd351850595d229a68c4652520ab947d1ecb51a46d4cd387b12bdf57bf" exitCode=0
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.222679 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c3d610d6-85f4-43b2-a597-4955431daa70","Type":"ContainerDied","Data":"c633aedd351850595d229a68c4652520ab947d1ecb51a46d4cd387b12bdf57bf"}
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.272122 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.272294 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.772265953 +0000 UTC m=+215.945972702 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.272414 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.272695 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.772688823 +0000 UTC m=+215.946395572 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.283341 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2a178b03-e81c-47af-898a-0463f964e327-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2a178b03-e81c-47af-898a-0463f964e327\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.288063 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.373685 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.373905 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.873863236 +0000 UTC m=+216.047569985 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.374299 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.374626 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.874617815 +0000 UTC m=+216.048324564 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.457435 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xm7j6"]
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.457670 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" podUID="51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" containerName="controller-manager" containerID="cri-o://ce0e0fe899887c51b11ad46c4b9de66da2a57ba53ab4c60cd515a10d07afe0f7" gracePeriod=30
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.458326 4994 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xm7j6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body=
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.458415 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" podUID="51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused"
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.475546 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.475689 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.975667265 +0000 UTC m=+216.149374014 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.475796 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.476084 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.976070535 +0000 UTC m=+216.149777284 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.505336 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb"]
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.505554 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" podUID="54ca6ee4-24c4-415f-a1b6-26f54e2992f8" containerName="route-controller-manager" containerID="cri-o://0f1267926bcca137db3abcb72a6c709ffad1b8249211d418fa61e8ee79ffda76" gracePeriod=30
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.508978 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb"
Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.577546 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.577917 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed.
No retries permitted until 2026-03-10 00:10:02.077901465 +0000 UTC m=+216.251608214 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.660067 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.679587 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.680544 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.180531154 +0000 UTC m=+216.354238013 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.740205 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.747144 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.747190 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.748505 4994 patch_prober.go:28] interesting pod/apiserver-76f77b778f-lxxqb container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.19:8443/livez\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.748557 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" podUID="ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.19:8443/livez\": dial tcp 10.217.0.19:8443: connect: connection refused" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.780977 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.781198 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.281172084 +0000 UTC m=+216.454878833 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.781297 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.781587 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.281579785 +0000 UTC m=+216.455286534 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.872961 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:01 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:01 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:01 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.873034 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.879821 4994 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-bkhqb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.879892 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" podUID="54ca6ee4-24c4-415f-a1b6-26f54e2992f8" containerName="route-controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.883328 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.883642 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.38362762 +0000 UTC m=+216.557334369 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.911369 4994 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xm7j6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.911425 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" podUID="51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" containerName="controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.971478 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59758: no serving certificate available for the kubelet" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.984388 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.984746 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.484730831 +0000 UTC m=+216.658437580 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.064561 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.085935 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.086116 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.586090599 +0000 UTC m=+216.759797348 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.086361 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.086741 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.586728555 +0000 UTC m=+216.760435304 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.161385 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.185758 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.185808 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.188969 4994 patch_prober.go:28] interesting pod/console-f9d7485db-rlqtz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.189029 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rlqtz" podUID="11b78073-cc4a-4a6f-89ab-631fde4b3371" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.189226 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.189346 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.689325574 +0000 UTC m=+216.863032333 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.189533 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.190240 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.690229586 +0000 UTC m=+216.863936335 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.230672 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.250043 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-879f6c89f-xm7j6_51cdf794-a18c-4a6f-a3ef-a07f03ce95a8/controller-manager/0.log" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.250095 4994 generic.go:334] "Generic (PLEG): container finished" podID="51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" containerID="ce0e0fe899887c51b11ad46c4b9de66da2a57ba53ab4c60cd515a10d07afe0f7" exitCode=2 Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.250205 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" event={"ID":"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8","Type":"ContainerDied","Data":"ce0e0fe899887c51b11ad46c4b9de66da2a57ba53ab4c60cd515a10d07afe0f7"} Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.295424 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.295591 4994 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.795567184 +0000 UTC m=+216.969273933 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.295906 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.297152 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.797141914 +0000 UTC m=+216.970848753 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.306406 4994 generic.go:334] "Generic (PLEG): container finished" podID="54ca6ee4-24c4-415f-a1b6-26f54e2992f8" containerID="0f1267926bcca137db3abcb72a6c709ffad1b8249211d418fa61e8ee79ffda76" exitCode=0 Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.306653 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" event={"ID":"54ca6ee4-24c4-415f-a1b6-26f54e2992f8","Type":"ContainerDied","Data":"0f1267926bcca137db3abcb72a6c709ffad1b8249211d418fa61e8ee79ffda76"} Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.349464 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.397433 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.398324 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:10:02.898300997 +0000 UTC m=+217.072007796 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.442944 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bwzk5"] Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.444094 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.446707 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.498766 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-utilities\") pod \"certified-operators-bwzk5\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.498800 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-catalog-content\") pod \"certified-operators-bwzk5\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.498829 
4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l95st\" (UniqueName: \"kubernetes.io/projected/fdad0261-804d-41dc-8a25-48018f136c0f-kube-api-access-l95st\") pod \"certified-operators-bwzk5\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.498906 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.499154 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.999142361 +0000 UTC m=+217.172849110 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.499676 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.499707 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.499825 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.499885 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.505187 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bwzk5"]
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.600241 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.600464 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-utilities\") pod \"certified-operators-bwzk5\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " pod="openshift-marketplace/certified-operators-bwzk5"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.600493 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-catalog-content\") pod \"certified-operators-bwzk5\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " pod="openshift-marketplace/certified-operators-bwzk5"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.600523 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l95st\" (UniqueName: \"kubernetes.io/projected/fdad0261-804d-41dc-8a25-48018f136c0f-kube-api-access-l95st\") pod \"certified-operators-bwzk5\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " pod="openshift-marketplace/certified-operators-bwzk5"
Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.600646 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.100615502 +0000 UTC m=+217.274322251 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.601450 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-catalog-content\") pod \"certified-operators-bwzk5\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " pod="openshift-marketplace/certified-operators-bwzk5"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.601541 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-utilities\") pod \"certified-operators-bwzk5\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " pod="openshift-marketplace/certified-operators-bwzk5"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.627622 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l95st\" (UniqueName: \"kubernetes.io/projected/fdad0261-804d-41dc-8a25-48018f136c0f-kube-api-access-l95st\") pod \"certified-operators-bwzk5\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " pod="openshift-marketplace/certified-operators-bwzk5"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.639547 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.644143 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zv2kt"]
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.645065 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zv2kt"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.650366 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.660351 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.672289 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zv2kt"]
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.702144 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-utilities\") pod \"community-operators-zv2kt\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " pod="openshift-marketplace/community-operators-zv2kt"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.702193 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.702239 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-catalog-content\") pod \"community-operators-zv2kt\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " pod="openshift-marketplace/community-operators-zv2kt"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.702262 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lx2s\" (UniqueName: \"kubernetes.io/projected/76aa065c-ed60-4237-b36f-5ce2865256ff-kube-api-access-5lx2s\") pod \"community-operators-zv2kt\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " pod="openshift-marketplace/community-operators-zv2kt"
Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.703239 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.203228182 +0000 UTC m=+217.376934921 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.757257 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bwzk5"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.803473 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.803594 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.303574623 +0000 UTC m=+217.477281372 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.803978 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-utilities\") pod \"community-operators-zv2kt\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " pod="openshift-marketplace/community-operators-zv2kt"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.804418 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-utilities\") pod \"community-operators-zv2kt\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " pod="openshift-marketplace/community-operators-zv2kt"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.804493 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.804790 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.304780724 +0000 UTC m=+217.478487473 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.804999 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-catalog-content\") pod \"community-operators-zv2kt\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " pod="openshift-marketplace/community-operators-zv2kt"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.805296 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-catalog-content\") pod \"community-operators-zv2kt\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " pod="openshift-marketplace/community-operators-zv2kt"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.805347 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lx2s\" (UniqueName: \"kubernetes.io/projected/76aa065c-ed60-4237-b36f-5ce2865256ff-kube-api-access-5lx2s\") pod \"community-operators-zv2kt\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " pod="openshift-marketplace/community-operators-zv2kt"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.837740 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c4tz9"]
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.839016 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c4tz9"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.842716 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lx2s\" (UniqueName: \"kubernetes.io/projected/76aa065c-ed60-4237-b36f-5ce2865256ff-kube-api-access-5lx2s\") pod \"community-operators-zv2kt\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " pod="openshift-marketplace/community-operators-zv2kt"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.843048 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c4tz9"]
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.851915 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-x6s5d"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.854686 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 00:10:02 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld
Mar 10 00:10:02 crc kubenswrapper[4994]: [+]process-running ok
Mar 10 00:10:02 crc kubenswrapper[4994]: healthz check failed
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.854725 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.908316 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.908581 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-catalog-content\") pod \"certified-operators-c4tz9\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " pod="openshift-marketplace/certified-operators-c4tz9"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.908691 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfh8s\" (UniqueName: \"kubernetes.io/projected/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-kube-api-access-xfh8s\") pod \"certified-operators-c4tz9\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " pod="openshift-marketplace/certified-operators-c4tz9"
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.908728 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-utilities\") pod \"certified-operators-c4tz9\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " pod="openshift-marketplace/certified-operators-c4tz9"
Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.909496 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.409481725 +0000 UTC m=+217.583188474 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.998396 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zv2kt"
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.010212 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-catalog-content\") pod \"certified-operators-c4tz9\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " pod="openshift-marketplace/certified-operators-c4tz9"
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.010272 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.010378 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfh8s\" (UniqueName: \"kubernetes.io/projected/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-kube-api-access-xfh8s\") pod \"certified-operators-c4tz9\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " pod="openshift-marketplace/certified-operators-c4tz9"
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.010418 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-utilities\") pod \"certified-operators-c4tz9\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " pod="openshift-marketplace/certified-operators-c4tz9"
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.010709 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-catalog-content\") pod \"certified-operators-c4tz9\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " pod="openshift-marketplace/certified-operators-c4tz9"
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.010961 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-utilities\") pod \"certified-operators-c4tz9\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " pod="openshift-marketplace/certified-operators-c4tz9"
Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.011010 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.510994917 +0000 UTC m=+217.684701736 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.031793 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfh8s\" (UniqueName: \"kubernetes.io/projected/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-kube-api-access-xfh8s\") pod \"certified-operators-c4tz9\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " pod="openshift-marketplace/certified-operators-c4tz9"
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.040037 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s7qcn"]
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.041542 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s7qcn"
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.062361 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s7qcn"]
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.111388 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.111571 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.611546364 +0000 UTC m=+217.785253113 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.111614 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-catalog-content\") pod \"community-operators-s7qcn\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " pod="openshift-marketplace/community-operators-s7qcn"
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.111689 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.111842 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-utilities\") pod \"community-operators-s7qcn\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " pod="openshift-marketplace/community-operators-s7qcn"
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.111969 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fbrm\" (UniqueName: \"kubernetes.io/projected/abe30cce-8379-4db8-838b-f48b4bc96621-kube-api-access-2fbrm\") pod \"community-operators-s7qcn\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " pod="openshift-marketplace/community-operators-s7qcn"
Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.112538 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.612523849 +0000 UTC m=+217.786230598 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.169453 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c4tz9"
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.213313 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.213443 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.713418336 +0000 UTC m=+217.887125085 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.213663 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-catalog-content\") pod \"community-operators-s7qcn\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " pod="openshift-marketplace/community-operators-s7qcn"
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.213938 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.213987 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-utilities\") pod \"community-operators-s7qcn\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " pod="openshift-marketplace/community-operators-s7qcn"
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.214026 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fbrm\" (UniqueName: \"kubernetes.io/projected/abe30cce-8379-4db8-838b-f48b4bc96621-kube-api-access-2fbrm\") pod \"community-operators-s7qcn\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " pod="openshift-marketplace/community-operators-s7qcn"
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.214121 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-catalog-content\") pod \"community-operators-s7qcn\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " pod="openshift-marketplace/community-operators-s7qcn"
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.214929 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-utilities\") pod \"community-operators-s7qcn\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " pod="openshift-marketplace/community-operators-s7qcn"
Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.215811 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.715511787 +0000 UTC m=+217.889218566 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.236175 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fbrm\" (UniqueName: \"kubernetes.io/projected/abe30cce-8379-4db8-838b-f48b4bc96621-kube-api-access-2fbrm\") pod \"community-operators-s7qcn\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " pod="openshift-marketplace/community-operators-s7qcn"
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.315035 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.315618 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.815586903 +0000 UTC m=+217.989293702 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.370254 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s7qcn"
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.417418 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.417764 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.917750542 +0000 UTC m=+218.091457291 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.519268 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.519407 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.019381016 +0000 UTC m=+218.193087785 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.521947 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.522436 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.022418972 +0000 UTC m=+218.196125721 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.623395 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.623530 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.123504183 +0000 UTC m=+218.297210922 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.623736 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.624409 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.124363274 +0000 UTC m=+218.298070033 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.725426 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.725625 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.225598499 +0000 UTC m=+218.399305248 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.725820 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.726143 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.226134893 +0000 UTC m=+218.399841642 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.773584 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.827106 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.827332 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.327303766 +0000 UTC m=+218.501010525 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.827452 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.827752 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.327740137 +0000 UTC m=+218.501446886 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.855404 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:03 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:03 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:03 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.855466 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.928618 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.928904 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:10:04.428889969 +0000 UTC m=+218.602596718 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.030694 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.031247 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.531221822 +0000 UTC m=+218.704928571 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.131392 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.131674 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.631621905 +0000 UTC m=+218.805328654 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.166185 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wv4d4" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.233374 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.235451 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.735432134 +0000 UTC m=+218.909138903 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.334313 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.334529 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.834501145 +0000 UTC m=+219.008207894 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.334636 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.335097 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.835078039 +0000 UTC m=+219.008784798 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.435819 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.436018 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.935982036 +0000 UTC m=+219.109688825 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.436072 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.436357 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.936344805 +0000 UTC m=+219.110051554 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.537292 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.537501 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.037471387 +0000 UTC m=+219.211178146 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.537567 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.537983 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.037971949 +0000 UTC m=+219.211678708 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.568303 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59762: no serving certificate available for the kubelet" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.636482 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hrh9x"] Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.637505 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.638323 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.638494 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.138464315 +0000 UTC m=+219.312171064 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.638666 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.639033 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.13901525 +0000 UTC m=+219.312721999 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.643193 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.651058 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrh9x"] Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.739298 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.739479 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.239441293 +0000 UTC m=+219.413148042 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.739680 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.739717 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btt9c\" (UniqueName: \"kubernetes.io/projected/a4a4dc2d-502f-4c05-ab76-1cc708f13006-kube-api-access-btt9c\") pod \"redhat-marketplace-hrh9x\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.739751 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-catalog-content\") pod \"redhat-marketplace-hrh9x\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.739828 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-utilities\") pod \"redhat-marketplace-hrh9x\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.740206 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.240195933 +0000 UTC m=+219.413902862 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.841345 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.841560 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.34152979 +0000 UTC m=+219.515236539 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.841626 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-catalog-content\") pod \"redhat-marketplace-hrh9x\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " pod="openshift-marketplace/redhat-marketplace-hrh9x"
Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.841713 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-utilities\") pod \"redhat-marketplace-hrh9x\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " pod="openshift-marketplace/redhat-marketplace-hrh9x"
Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.842006 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.842076 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btt9c\" (UniqueName: \"kubernetes.io/projected/a4a4dc2d-502f-4c05-ab76-1cc708f13006-kube-api-access-btt9c\") pod \"redhat-marketplace-hrh9x\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " pod="openshift-marketplace/redhat-marketplace-hrh9x"
Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.842143 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-catalog-content\") pod \"redhat-marketplace-hrh9x\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " pod="openshift-marketplace/redhat-marketplace-hrh9x"
Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.842715 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.342698739 +0000 UTC m=+219.516405578 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.843152 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-utilities\") pod \"redhat-marketplace-hrh9x\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " pod="openshift-marketplace/redhat-marketplace-hrh9x"
Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.854340 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 00:10:04 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld
Mar 10 00:10:04 crc kubenswrapper[4994]: [+]process-running ok
Mar 10 00:10:04 crc kubenswrapper[4994]: healthz check failed
Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.854421 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.878171 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btt9c\" (UniqueName: \"kubernetes.io/projected/a4a4dc2d-502f-4c05-ab76-1cc708f13006-kube-api-access-btt9c\") pod \"redhat-marketplace-hrh9x\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " pod="openshift-marketplace/redhat-marketplace-hrh9x"
Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.893846 4994 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.943201 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.943429 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.443414681 +0000 UTC m=+219.617121430 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.000661 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrh9x"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.045630 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bzrd2"]
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.046373 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.046937 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.546917512 +0000 UTC m=+219.720624271 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.048573 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bzrd2"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.061355 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bzrd2"]
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.149240 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.149414 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.649387348 +0000 UTC m=+219.823094097 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.150540 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.150758 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdkv4\" (UniqueName: \"kubernetes.io/projected/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-kube-api-access-fdkv4\") pod \"redhat-marketplace-bzrd2\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " pod="openshift-marketplace/redhat-marketplace-bzrd2"
Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.150858 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.650850685 +0000 UTC m=+219.824557434 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.151269 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-utilities\") pod \"redhat-marketplace-bzrd2\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " pod="openshift-marketplace/redhat-marketplace-bzrd2"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.151468 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-catalog-content\") pod \"redhat-marketplace-bzrd2\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " pod="openshift-marketplace/redhat-marketplace-bzrd2"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.252598 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.252743 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.752724796 +0000 UTC m=+219.926431545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.252835 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-utilities\") pod \"redhat-marketplace-bzrd2\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " pod="openshift-marketplace/redhat-marketplace-bzrd2"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.252886 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-catalog-content\") pod \"redhat-marketplace-bzrd2\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " pod="openshift-marketplace/redhat-marketplace-bzrd2"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.252954 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.252980 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdkv4\" (UniqueName: \"kubernetes.io/projected/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-kube-api-access-fdkv4\") pod \"redhat-marketplace-bzrd2\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " pod="openshift-marketplace/redhat-marketplace-bzrd2"
Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.253307 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.75329062 +0000 UTC m=+219.926997369 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.253500 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-catalog-content\") pod \"redhat-marketplace-bzrd2\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " pod="openshift-marketplace/redhat-marketplace-bzrd2"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.254232 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-utilities\") pod \"redhat-marketplace-bzrd2\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " pod="openshift-marketplace/redhat-marketplace-bzrd2"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.268866 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdkv4\" (UniqueName: \"kubernetes.io/projected/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-kube-api-access-fdkv4\") pod \"redhat-marketplace-bzrd2\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " pod="openshift-marketplace/redhat-marketplace-bzrd2"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.354177 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.354339 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.854309869 +0000 UTC m=+220.028016628 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.356540 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.356850 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.856838803 +0000 UTC m=+220.030545572 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.380890 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bzrd2"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.457939 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.458076 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.958058047 +0000 UTC m=+220.131764796 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.458937 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.459240 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.959230826 +0000 UTC m=+220.132937565 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.559941 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.560078 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.06003464 +0000 UTC m=+220.233741389 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.560169 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.560576 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.060560824 +0000 UTC m=+220.234267573 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.633183 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wpd8k"]
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.634824 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.636812 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.648529 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wpd8k"]
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.661073 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.662044 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.162020954 +0000 UTC m=+220.335727703 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.763205 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-utilities\") pod \"redhat-operators-wpd8k\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.763293 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.763347 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-catalog-content\") pod \"redhat-operators-wpd8k\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.763448 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpl2h\" (UniqueName: \"kubernetes.io/projected/6525b40b-1c23-4533-a025-4d86bc406f00-kube-api-access-xpl2h\") pod \"redhat-operators-wpd8k\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.764233 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.264218072 +0000 UTC m=+220.437924911 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.854226 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 00:10:05 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld
Mar 10 00:10:05 crc kubenswrapper[4994]: [+]process-running ok
Mar 10 00:10:05 crc kubenswrapper[4994]: healthz check failed
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.854298 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.866547 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.870443 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.370407551 +0000 UTC m=+220.544114320 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.871567 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-utilities\") pod \"redhat-operators-wpd8k\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.871642 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.871676 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-catalog-content\") pod \"redhat-operators-wpd8k\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.871732 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpl2h\" (UniqueName: \"kubernetes.io/projected/6525b40b-1c23-4533-a025-4d86bc406f00-kube-api-access-xpl2h\") pod \"redhat-operators-wpd8k\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.872150 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-utilities\") pod \"redhat-operators-wpd8k\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.872338 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.372304639 +0000 UTC m=+220.546011428 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.872387 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-catalog-content\") pod \"redhat-operators-wpd8k\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.898123 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpl2h\" (UniqueName: \"kubernetes.io/projected/6525b40b-1c23-4533-a025-4d86bc406f00-kube-api-access-xpl2h\") pod \"redhat-operators-wpd8k\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.953390 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.972686 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.973341 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.473266327 +0000 UTC m=+220.646973076 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.032654 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t5kj4"]
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.033713 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t5kj4"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.045762 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t5kj4"]
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.074848 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-catalog-content\") pod \"redhat-operators-t5kj4\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " pod="openshift-marketplace/redhat-operators-t5kj4"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.075105 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.075153 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-utilities\") pod \"redhat-operators-t5kj4\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " pod="openshift-marketplace/redhat-operators-t5kj4"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.075185 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frwzt\" (UniqueName: \"kubernetes.io/projected/0429fae4-1356-4d61-86a3-267f74f27636-kube-api-access-frwzt\") pod \"redhat-operators-t5kj4\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " pod="openshift-marketplace/redhat-operators-t5kj4"
Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.075449 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.575438664 +0000 UTC m=+220.749145413 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.117192 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59750: no serving certificate available for the kubelet"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.176084 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.176322 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.676286649 +0000 UTC m=+220.849993478 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.176395 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-catalog-content\") pod \"redhat-operators-t5kj4\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " pod="openshift-marketplace/redhat-operators-t5kj4" Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.176531 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.176701 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-utilities\") pod \"redhat-operators-t5kj4\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " pod="openshift-marketplace/redhat-operators-t5kj4" Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.176814 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frwzt\" (UniqueName: \"kubernetes.io/projected/0429fae4-1356-4d61-86a3-267f74f27636-kube-api-access-frwzt\") pod \"redhat-operators-t5kj4\" (UID: 
\"0429fae4-1356-4d61-86a3-267f74f27636\") " pod="openshift-marketplace/redhat-operators-t5kj4" Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.176832 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-catalog-content\") pod \"redhat-operators-t5kj4\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " pod="openshift-marketplace/redhat-operators-t5kj4" Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.176964 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.676951127 +0000 UTC m=+220.850657886 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.177129 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-utilities\") pod \"redhat-operators-t5kj4\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " pod="openshift-marketplace/redhat-operators-t5kj4" Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.197647 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frwzt\" (UniqueName: \"kubernetes.io/projected/0429fae4-1356-4d61-86a3-267f74f27636-kube-api-access-frwzt\") pod \"redhat-operators-t5kj4\" (UID: 
\"0429fae4-1356-4d61-86a3-267f74f27636\") " pod="openshift-marketplace/redhat-operators-t5kj4" Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.277823 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.278155 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.778098609 +0000 UTC m=+220.951805368 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.278270 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.279032 4994 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.779020692 +0000 UTC m=+220.952727441 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.355675 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t5kj4" Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.380208 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.381131 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.881098278 +0000 UTC m=+221.054805067 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.481689 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.482046 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.982034755 +0000 UTC m=+221.155741504 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.588752 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.589371 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.089348432 +0000 UTC m=+221.263055191 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.690092 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.690443 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.190427703 +0000 UTC m=+221.364134452 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.737844 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.751842 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.764973 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.792078 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.792324 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.292302983 +0000 UTC m=+221.466009732 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.792360 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.792629 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.292622541 +0000 UTC m=+221.466329290 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.845372 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.855890 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.856054 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:06 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:06 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:06 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.856081 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.898742 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx7fb\" (UniqueName: \"kubernetes.io/projected/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-kube-api-access-fx7fb\") pod \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") " Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.898831 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-secret-volume\") pod \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") " Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.898923 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-config-volume\") pod \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") " Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.899058 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.908241 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-config-volume" (OuterVolumeSpecName: "config-volume") pod "0f2b43c0-a96d-4ea3-8d46-d6919aedf741" (UID: "0f2b43c0-a96d-4ea3-8d46-d6919aedf741"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.908343 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.408328209 +0000 UTC m=+221.582034958 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.910184 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-kube-api-access-fx7fb" (OuterVolumeSpecName: "kube-api-access-fx7fb") pod "0f2b43c0-a96d-4ea3-8d46-d6919aedf741" (UID: "0f2b43c0-a96d-4ea3-8d46-d6919aedf741"). InnerVolumeSpecName "kube-api-access-fx7fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.919501 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0f2b43c0-a96d-4ea3-8d46-d6919aedf741" (UID: "0f2b43c0-a96d-4ea3-8d46-d6919aedf741"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.000563 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3d610d6-85f4-43b2-a597-4955431daa70-kubelet-dir\") pod \"c3d610d6-85f4-43b2-a597-4955431daa70\" (UID: \"c3d610d6-85f4-43b2-a597-4955431daa70\") " Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.000686 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3d610d6-85f4-43b2-a597-4955431daa70-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c3d610d6-85f4-43b2-a597-4955431daa70" (UID: "c3d610d6-85f4-43b2-a597-4955431daa70"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.000713 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3d610d6-85f4-43b2-a597-4955431daa70-kube-api-access\") pod \"c3d610d6-85f4-43b2-a597-4955431daa70\" (UID: \"c3d610d6-85f4-43b2-a597-4955431daa70\") " Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.001144 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.001230 4994 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3d610d6-85f4-43b2-a597-4955431daa70-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.001246 4994 reconciler_common.go:293] 
"Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.001258 4994 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.001272 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx7fb\" (UniqueName: \"kubernetes.io/projected/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-kube-api-access-fx7fb\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.001512 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.501500362 +0000 UTC m=+221.675207111 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.006053 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3d610d6-85f4-43b2-a597-4955431daa70-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c3d610d6-85f4-43b2-a597-4955431daa70" (UID: "c3d610d6-85f4-43b2-a597-4955431daa70"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.101952 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.102160 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.602134061 +0000 UTC m=+221.775840810 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.102518 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.102582 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3d610d6-85f4-43b2-a597-4955431daa70-kube-api-access\") on node 
\"crc\" DevicePath \"\"" Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.102963 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.602945981 +0000 UTC m=+221.776652730 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.203218 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.203424 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.703373686 +0000 UTC m=+221.877080435 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.203464 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.203968 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.703957351 +0000 UTC m=+221.877664100 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.304996 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.305187 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.805155764 +0000 UTC m=+221.978862513 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.305349 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.305790 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.805695518 +0000 UTC m=+221.979402267 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.334696 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c3d610d6-85f4-43b2-a597-4955431daa70","Type":"ContainerDied","Data":"31dddbc250d328a5341968e6887f3a959acf990551da79bebde9d9af83c538be"} Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.334741 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31dddbc250d328a5341968e6887f3a959acf990551da79bebde9d9af83c538be" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.334809 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.338402 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551690-7rbl8" event={"ID":"f04aae5d-b067-4e49-82f3-66412ec1bba6","Type":"ContainerStarted","Data":"4311de40639f67ea8a55d45863ea9b8bade3cfa62b21c735dd8d526f7c5e805a"} Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.340094 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" event={"ID":"0f2b43c0-a96d-4ea3-8d46-d6919aedf741","Type":"ContainerDied","Data":"bae537ffa0b0b80c06d0b79407ab1e4733786fdf150560f197e315922a963b90"} Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.340130 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bae537ffa0b0b80c06d0b79407ab1e4733786fdf150560f197e315922a963b90" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.340201 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.407266 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.407544 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.907508697 +0000 UTC m=+222.081215476 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.407745 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.408418 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.90840581 +0000 UTC m=+222.082112549 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.508466 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.508671 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.00864311 +0000 UTC m=+222.182349889 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.508982 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.509370 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.009349947 +0000 UTC m=+222.183056706 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.611322 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.611522 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.111493555 +0000 UTC m=+222.285200314 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.611654 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.612537 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.11250968 +0000 UTC m=+222.286216469 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.712859 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.713205 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.21316497 +0000 UTC m=+222.386871759 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.713252 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.713726 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.213709524 +0000 UTC m=+222.387416313 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.814625 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.814774 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.314748934 +0000 UTC m=+222.488455683 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.815025 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.815316 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.315307617 +0000 UTC m=+222.489014366 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.864283 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:07 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:07 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:07 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.864356 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.891605 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.893963 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-879f6c89f-xm7j6_51cdf794-a18c-4a6f-a3ef-a07f03ce95a8/controller-manager/0.log" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.894013 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.916419 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.916568 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.416540083 +0000 UTC m=+222.590246832 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.916737 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.917065 4994 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.417052516 +0000 UTC m=+222.590759265 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.018960 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxm5s\" (UniqueName: \"kubernetes.io/projected/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-kube-api-access-rxm5s\") pod \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.019008 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-serving-cert\") pod \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.019024 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thbrs\" (UniqueName: \"kubernetes.io/projected/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-kube-api-access-thbrs\") pod \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.019048 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-proxy-ca-bundles\") pod \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.019237 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.019265 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-client-ca\") pod \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.019282 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-client-ca\") pod \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.019310 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-serving-cert\") pod \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.019329 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-config\") pod \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 
00:10:08.019356 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-config\") pod \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.020374 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-config" (OuterVolumeSpecName: "config") pod "51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" (UID: "51cdf794-a18c-4a6f-a3ef-a07f03ce95a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.020450 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.520438094 +0000 UTC m=+222.694144843 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.022291 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-config" (OuterVolumeSpecName: "config") pod "54ca6ee4-24c4-415f-a1b6-26f54e2992f8" (UID: "54ca6ee4-24c4-415f-a1b6-26f54e2992f8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.021970 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-client-ca" (OuterVolumeSpecName: "client-ca") pod "54ca6ee4-24c4-415f-a1b6-26f54e2992f8" (UID: "54ca6ee4-24c4-415f-a1b6-26f54e2992f8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.024981 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-kube-api-access-rxm5s" (OuterVolumeSpecName: "kube-api-access-rxm5s") pod "54ca6ee4-24c4-415f-a1b6-26f54e2992f8" (UID: "54ca6ee4-24c4-415f-a1b6-26f54e2992f8"). InnerVolumeSpecName "kube-api-access-rxm5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.025282 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" (UID: "51cdf794-a18c-4a6f-a3ef-a07f03ce95a8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.025575 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-client-ca" (OuterVolumeSpecName: "client-ca") pod "51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" (UID: "51cdf794-a18c-4a6f-a3ef-a07f03ce95a8"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.026843 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" (UID: "51cdf794-a18c-4a6f-a3ef-a07f03ce95a8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.027138 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "54ca6ee4-24c4-415f-a1b6-26f54e2992f8" (UID: "54ca6ee4-24c4-415f-a1b6-26f54e2992f8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.048899 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-kube-api-access-thbrs" (OuterVolumeSpecName: "kube-api-access-thbrs") pod "51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" (UID: "51cdf794-a18c-4a6f-a3ef-a07f03ce95a8"). InnerVolumeSpecName "kube-api-access-thbrs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.121351 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.121506 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.121523 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.121536 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.121547 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxm5s\" (UniqueName: \"kubernetes.io/projected/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-kube-api-access-rxm5s\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.121558 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.121571 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thbrs\" (UniqueName: 
\"kubernetes.io/projected/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-kube-api-access-thbrs\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.121582 4994 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.121593 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.121603 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.122054 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.622038178 +0000 UTC m=+222.795744927 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.224067 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.235579 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.735548719 +0000 UTC m=+222.909255468 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.327397 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.327695 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.827684186 +0000 UTC m=+223.001390935 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.346154 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" event={"ID":"54ca6ee4-24c4-415f-a1b6-26f54e2992f8","Type":"ContainerDied","Data":"68e7ebb7b9a0fa967b84de70c209836439efd368a03ea8c0304dd46c8d9878be"} Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.346202 4994 scope.go:117] "RemoveContainer" containerID="0f1267926bcca137db3abcb72a6c709ffad1b8249211d418fa61e8ee79ffda76" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.346323 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.353334 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-879f6c89f-xm7j6_51cdf794-a18c-4a6f-a3ef-a07f03ce95a8/controller-manager/0.log" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.353390 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" event={"ID":"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8","Type":"ContainerDied","Data":"e29a59b07d62557e79f131725545f7bbc14a1ca6dcc0ac4661855d156c889001"} Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.353459 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.372222 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb"] Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.385754 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb"] Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.395061 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xm7j6"] Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.399145 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xm7j6"] Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.428920 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.429456 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.929436785 +0000 UTC m=+223.103143534 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.530861 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.531239 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.031216893 +0000 UTC m=+223.204923652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.565281 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" path="/var/lib/kubelet/pods/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8/volumes" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.566004 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54ca6ee4-24c4-415f-a1b6-26f54e2992f8" path="/var/lib/kubelet/pods/54ca6ee4-24c4-415f-a1b6-26f54e2992f8/volumes" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.631692 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.632057 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.132014857 +0000 UTC m=+223.305721646 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.733774 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.734113 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.234098612 +0000 UTC m=+223.407805351 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.835012 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.835173 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.335153232 +0000 UTC m=+223.508860001 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.835212 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.835514 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.335506512 +0000 UTC m=+223.509213261 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.854546 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:08 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:08 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:08 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.854622 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.936122 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.936272 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:10:09.436248264 +0000 UTC m=+223.609955013 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.936489 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.936750 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.436739326 +0000 UTC m=+223.610446075 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.042937 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.043238 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.543200182 +0000 UTC m=+223.716906971 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.043286 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.043747 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.543734645 +0000 UTC m=+223.717441394 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.144219 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.144428 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.644395846 +0000 UTC m=+223.818102595 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.144639 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.144972 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.6449585 +0000 UTC m=+223.818665259 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.246008 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.246282 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.746255476 +0000 UTC m=+223.919962225 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.246350 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.246640 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.746627725 +0000 UTC m=+223.920334474 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.347847 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.348054 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.848028405 +0000 UTC m=+224.021735154 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.348111 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.348393 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.848381333 +0000 UTC m=+224.022088072 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.449817 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.450594 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.950569001 +0000 UTC m=+224.124275780 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.465129 4994 scope.go:117] "RemoveContainer" containerID="ce0e0fe899887c51b11ad46c4b9de66da2a57ba53ab4c60cd515a10d07afe0f7" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.552517 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.553003 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.052987446 +0000 UTC m=+224.226694195 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.633708 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"]
Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.634033 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ca6ee4-24c4-415f-a1b6-26f54e2992f8" containerName="route-controller-manager"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.634052 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ca6ee4-24c4-415f-a1b6-26f54e2992f8" containerName="route-controller-manager"
Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.634064 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f2b43c0-a96d-4ea3-8d46-d6919aedf741" containerName="collect-profiles"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.634070 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2b43c0-a96d-4ea3-8d46-d6919aedf741" containerName="collect-profiles"
Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.634077 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3d610d6-85f4-43b2-a597-4955431daa70" containerName="pruner"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.634083 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3d610d6-85f4-43b2-a597-4955431daa70" containerName="pruner"
Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.634106 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" containerName="controller-manager"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.634112 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" containerName="controller-manager"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.634215 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3d610d6-85f4-43b2-a597-4955431daa70" containerName="pruner"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.634234 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" containerName="controller-manager"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.634249 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="54ca6ee4-24c4-415f-a1b6-26f54e2992f8" containerName="route-controller-manager"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.634262 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f2b43c0-a96d-4ea3-8d46-d6919aedf741" containerName="collect-profiles"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.634678 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.637963 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.638216 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.638381 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.638499 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.638604 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.640112 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.642529 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5c67ff489b-x49rf"]
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.643228 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.644215 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.644458 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.645167 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.645528 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.645710 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.645836 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.650033 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.653811 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.654121 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.154106588 +0000 UTC m=+224.327813337 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.660382 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c67ff489b-x49rf"]
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.666436 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"]
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.716898 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59752: no serving certificate available for the kubelet"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.755736 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-proxy-ca-bundles\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.755804 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.755831 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51ce0bbc-ee87-47f6-be5d-24f40386cb60-serving-cert\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.755887 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-config\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.755911 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-client-ca\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.755966 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa106de9-72a4-4364-a10d-2ec2c543afcf-serving-cert\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.756132 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vb2m\" (UniqueName: \"kubernetes.io/projected/51ce0bbc-ee87-47f6-be5d-24f40386cb60-kube-api-access-6vb2m\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.756148 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-client-ca\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.756162 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-config\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.756223 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7jgf\" (UniqueName: \"kubernetes.io/projected/aa106de9-72a4-4364-a10d-2ec2c543afcf-kube-api-access-c7jgf\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf"
Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.756564 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.256552832 +0000 UTC m=+224.430259581 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.860179 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.860662 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-client-ca\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.860724 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vb2m\" (UniqueName: \"kubernetes.io/projected/51ce0bbc-ee87-47f6-be5d-24f40386cb60-kube-api-access-6vb2m\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.860755 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa106de9-72a4-4364-a10d-2ec2c543afcf-serving-cert\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.860785 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-client-ca\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.860810 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-config\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.860865 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7jgf\" (UniqueName: \"kubernetes.io/projected/aa106de9-72a4-4364-a10d-2ec2c543afcf-kube-api-access-c7jgf\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.860923 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-proxy-ca-bundles\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.860974 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51ce0bbc-ee87-47f6-be5d-24f40386cb60-serving-cert\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.861005 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-config\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.861337 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 00:10:09 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld
Mar 10 00:10:09 crc kubenswrapper[4994]: [+]process-running ok
Mar 10 00:10:09 crc kubenswrapper[4994]: healthz check failed
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.861455 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.862357 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-config\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"
Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.863081 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.363052079 +0000 UTC m=+224.536758828 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.863739 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-config\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.867487 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-client-ca\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.873760 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-client-ca\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.884440 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa106de9-72a4-4364-a10d-2ec2c543afcf-serving-cert\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.884555 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51ce0bbc-ee87-47f6-be5d-24f40386cb60-serving-cert\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.895116 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-proxy-ca-bundles\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.895613 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vb2m\" (UniqueName: \"kubernetes.io/projected/51ce0bbc-ee87-47f6-be5d-24f40386cb60-kube-api-access-6vb2m\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.905110 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7jgf\" (UniqueName: \"kubernetes.io/projected/aa106de9-72a4-4364-a10d-2ec2c543afcf-kube-api-access-c7jgf\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf"
Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.962089 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.962502 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.462480989 +0000 UTC m=+224.636187738 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.011264 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.016244 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf"
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.019510 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrh9x"]
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.040901 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c4tz9"]
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.051080 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t5kj4"]
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.053595 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bzrd2"]
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.063491 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.064020 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.56399942 +0000 UTC m=+224.737706189 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.076179 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s7qcn"]
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.164996 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.165342 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.665324947 +0000 UTC m=+224.839031686 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.257693 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.264999 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zv2kt"]
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.265513 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.265672 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.76565125 +0000 UTC m=+224.939357999 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.265850 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.266142 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.766132691 +0000 UTC m=+224.939839440 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.278018 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bwzk5"]
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.281545 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wpd8k"]
Mar 10 00:10:10 crc kubenswrapper[4994]: W0310 00:10:10.340783 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4a4dc2d_502f_4c05_ab76_1cc708f13006.slice/crio-5863f96f41db6fc401ecb7000e3f5a6cfef96961acdd0f3f461004c58668116e WatchSource:0}: Error finding container 5863f96f41db6fc401ecb7000e3f5a6cfef96961acdd0f3f461004c58668116e: Status 404 returned error can't find the container with id 5863f96f41db6fc401ecb7000e3f5a6cfef96961acdd0f3f461004c58668116e
Mar 10 00:10:10 crc kubenswrapper[4994]: W0310 00:10:10.341885 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab6cd76f_6272_4fcd_8c75_3040c45ef1b5.slice/crio-7734e5951b02bb3a0a46ea5a16ee396269ee4c95d8725567063c864699e319c0 WatchSource:0}: Error finding container 7734e5951b02bb3a0a46ea5a16ee396269ee4c95d8725567063c864699e319c0: Status 404 returned error can't find the container with id 7734e5951b02bb3a0a46ea5a16ee396269ee4c95d8725567063c864699e319c0
Mar 10 00:10:10 crc kubenswrapper[4994]: W0310 00:10:10.342621 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0429fae4_1356_4d61_86a3_267f74f27636.slice/crio-c31a76fcba6e0a2edf574624c6292a3f51560169f1f1fa309a2ea336e40231d4 WatchSource:0}: Error finding container c31a76fcba6e0a2edf574624c6292a3f51560169f1f1fa309a2ea336e40231d4: Status 404 returned error can't find the container with id c31a76fcba6e0a2edf574624c6292a3f51560169f1f1fa309a2ea336e40231d4
Mar 10 00:10:10 crc kubenswrapper[4994]: W0310 00:10:10.350537 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabe30cce_8379_4db8_838b_f48b4bc96621.slice/crio-a14572119f65e2d0fbfc63101455582a2b9abfe6948028817dc155c8d3a9c7ab WatchSource:0}: Error finding container a14572119f65e2d0fbfc63101455582a2b9abfe6948028817dc155c8d3a9c7ab: Status 404 returned error can't find the container with id a14572119f65e2d0fbfc63101455582a2b9abfe6948028817dc155c8d3a9c7ab
Mar 10 00:10:10 crc kubenswrapper[4994]: W0310 00:10:10.350799 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2a178b03_e81c_47af_898a_0463f964e327.slice/crio-c9aac967fb18b6328851a9397e77955184a1e3f93d91c3163e98cfc3de1758cd WatchSource:0}: Error finding container c9aac967fb18b6328851a9397e77955184a1e3f93d91c3163e98cfc3de1758cd: Status 404 returned error can't find the container with id c9aac967fb18b6328851a9397e77955184a1e3f93d91c3163e98cfc3de1758cd
Mar 10 00:10:10 crc kubenswrapper[4994]: W0310 00:10:10.351757 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76aa065c_ed60_4237_b36f_5ce2865256ff.slice/crio-3e81512696f04f227cf371ddcf1556e047699059d617fb7f43f9cba658930f7f WatchSource:0}: Error finding container 3e81512696f04f227cf371ddcf1556e047699059d617fb7f43f9cba658930f7f: Status 404 returned error can't find the container with id 3e81512696f04f227cf371ddcf1556e047699059d617fb7f43f9cba658930f7f
Mar 10 00:10:10 crc kubenswrapper[4994]: W0310 00:10:10.357016 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdad0261_804d_41dc_8a25_48018f136c0f.slice/crio-97695118e06c708e4423796822ac54cea80fe3fc3b7289e71d5f6ac300dfeb72 WatchSource:0}: Error finding container 97695118e06c708e4423796822ac54cea80fe3fc3b7289e71d5f6ac300dfeb72: Status 404 returned error can't find the container with id 97695118e06c708e4423796822ac54cea80fe3fc3b7289e71d5f6ac300dfeb72
Mar 10 00:10:10 crc kubenswrapper[4994]: W0310 00:10:10.357427 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6525b40b_1c23_4533_a025_4d86bc406f00.slice/crio-45f9adf9166a73c98a80bad9d037f2560042ccbdeec2aa18a7cbfa8528d64c9a WatchSource:0}: Error finding container 45f9adf9166a73c98a80bad9d037f2560042ccbdeec2aa18a7cbfa8528d64c9a: Status 404 returned error can't find the container with id 45f9adf9166a73c98a80bad9d037f2560042ccbdeec2aa18a7cbfa8528d64c9a
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.367962 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.368489 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.868452474 +0000 UTC m=+225.042159223 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.377664 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7qcn" event={"ID":"abe30cce-8379-4db8-838b-f48b4bc96621","Type":"ContainerStarted","Data":"a14572119f65e2d0fbfc63101455582a2b9abfe6948028817dc155c8d3a9c7ab"}
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.378950 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2a178b03-e81c-47af-898a-0463f964e327","Type":"ContainerStarted","Data":"c9aac967fb18b6328851a9397e77955184a1e3f93d91c3163e98cfc3de1758cd"}
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.380025 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5kj4" event={"ID":"0429fae4-1356-4d61-86a3-267f74f27636","Type":"ContainerStarted","Data":"c31a76fcba6e0a2edf574624c6292a3f51560169f1f1fa309a2ea336e40231d4"}
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.381115 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zv2kt" event={"ID":"76aa065c-ed60-4237-b36f-5ce2865256ff","Type":"ContainerStarted","Data":"3e81512696f04f227cf371ddcf1556e047699059d617fb7f43f9cba658930f7f"}
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.382288 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrh9x" event={"ID":"a4a4dc2d-502f-4c05-ab76-1cc708f13006","Type":"ContainerStarted","Data":"5863f96f41db6fc401ecb7000e3f5a6cfef96961acdd0f3f461004c58668116e"}
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.383292 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwzk5" event={"ID":"fdad0261-804d-41dc-8a25-48018f136c0f","Type":"ContainerStarted","Data":"97695118e06c708e4423796822ac54cea80fe3fc3b7289e71d5f6ac300dfeb72"}
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.384316 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpd8k" event={"ID":"6525b40b-1c23-4533-a025-4d86bc406f00","Type":"ContainerStarted","Data":"45f9adf9166a73c98a80bad9d037f2560042ccbdeec2aa18a7cbfa8528d64c9a"}
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.385390 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4tz9" event={"ID":"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5","Type":"ContainerStarted","Data":"7734e5951b02bb3a0a46ea5a16ee396269ee4c95d8725567063c864699e319c0"}
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.387515 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzrd2" event={"ID":"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c","Type":"ContainerStarted","Data":"d1b32d28a2daabcbb6951ddc2404e012b74605f090a8de0ccde979112a9da8a3"}
Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.469276 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.469703 4994 nestedpendingoperations.go:348]
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.969687878 +0000 UTC m=+225.143394637 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.573441 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.573850 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.073833046 +0000 UTC m=+225.247539795 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.674603 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.674913 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.174901506 +0000 UTC m=+225.348608255 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.776021 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.776132 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.27611558 +0000 UTC m=+225.449822329 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.776308 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.776574 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.276566092 +0000 UTC m=+225.450272841 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.854949 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:10 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:10 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:10 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.855012 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.877543 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.877796 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:10:11.377747716 +0000 UTC m=+225.551454475 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.878027 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.878481 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.378470214 +0000 UTC m=+225.552176963 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.978825 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.979064 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.479022691 +0000 UTC m=+225.652729440 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.979552 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.980036 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.480028076 +0000 UTC m=+225.653734825 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.080591 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.081108 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.581092976 +0000 UTC m=+225.754799725 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.182490 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.182894 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.682862064 +0000 UTC m=+225.856568813 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.226275 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c67ff489b-x49rf"] Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.276358 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"] Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.284008 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.284173 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.784145681 +0000 UTC m=+225.957852430 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.284222 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.284665 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.784653494 +0000 UTC m=+225.958360243 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.385281 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.385656 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.885640722 +0000 UTC m=+226.059347471 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.398996 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" event={"ID":"51ce0bbc-ee87-47f6-be5d-24f40386cb60","Type":"ContainerStarted","Data":"b99811bd76278a20c75ea5a5530b5792fd876cafc8ee3f721f73cedbdb0b24d7"} Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.403703 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-87hn7_9a1c67e3-f6df-4b4d-b3a3-669503580446/cluster-samples-operator/0.log" Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.403794 4994 generic.go:334] "Generic (PLEG): container finished" podID="9a1c67e3-f6df-4b4d-b3a3-669503580446" containerID="03ebf9cde09f85323fd76e933846dfc1dfd8ba1c198723c3174424b6bd6a1144" exitCode=2 Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.403858 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" event={"ID":"9a1c67e3-f6df-4b4d-b3a3-669503580446","Type":"ContainerDied","Data":"03ebf9cde09f85323fd76e933846dfc1dfd8ba1c198723c3174424b6bd6a1144"} Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.404553 4994 scope.go:117] "RemoveContainer" containerID="03ebf9cde09f85323fd76e933846dfc1dfd8ba1c198723c3174424b6bd6a1144" Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.407582 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-s7qcn" event={"ID":"abe30cce-8379-4db8-838b-f48b4bc96621","Type":"ContainerStarted","Data":"b12c2570f0f12ececa7d019201ed8ccc106ee186110fc56077d24c5532fccef4"} Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.415677 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" event={"ID":"aa106de9-72a4-4364-a10d-2ec2c543afcf","Type":"ContainerStarted","Data":"8a78cdbee32124e2065f39d9cf54d4202c2bf87b8ec1b372bda9861fe5ee8d02"} Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.486703 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.487296 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.987278947 +0000 UTC m=+226.160985756 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.588745 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.589027 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.088987473 +0000 UTC m=+226.262694232 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.589114 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.589415 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.089401964 +0000 UTC m=+226.263108713 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.690675 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.690971 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.190946196 +0000 UTC m=+226.364652945 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.691184 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.692709 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.19269974 +0000 UTC m=+226.366406489 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.792533 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.792807 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.292789696 +0000 UTC m=+226.466496445 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.793183 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.793547 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.293539725 +0000 UTC m=+226.467246474 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.855041 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:11 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:11 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:11 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.855366 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.894750 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.895342 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:10:12.395311323 +0000 UTC m=+226.569018082 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.996098 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.996661 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.49664629 +0000 UTC m=+226.670353039 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.098196 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:12 crc kubenswrapper[4994]: E0310 00:10:12.099157 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.599141587 +0000 UTC m=+226.772848336 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.184290 4994 patch_prober.go:28] interesting pod/console-f9d7485db-rlqtz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.184766 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rlqtz" podUID="11b78073-cc4a-4a6f-89ab-631fde4b3371" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.200224 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:12 crc kubenswrapper[4994]: E0310 00:10:12.200654 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.700643958 +0000 UTC m=+226.874350707 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.301212 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:12 crc kubenswrapper[4994]: E0310 00:10:12.301516 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.801465273 +0000 UTC m=+226.975172032 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.301593 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:12 crc kubenswrapper[4994]: E0310 00:10:12.301957 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.801942274 +0000 UTC m=+226.975649023 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.402643 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:12 crc kubenswrapper[4994]: E0310 00:10:12.402907 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.902887952 +0000 UTC m=+227.076594701 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.403016 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:12 crc kubenswrapper[4994]: E0310 00:10:12.403321 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.903310022 +0000 UTC m=+227.077016771 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.426854 4994 generic.go:334] "Generic (PLEG): container finished" podID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerID="b4775c7cc3dc93dde45a7cd1c8d5a247763c1a3b907807a2f3655ae2194f4c42" exitCode=0 Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.426990 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4tz9" event={"ID":"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5","Type":"ContainerDied","Data":"b4775c7cc3dc93dde45a7cd1c8d5a247763c1a3b907807a2f3655ae2194f4c42"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.430282 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-87hn7_9a1c67e3-f6df-4b4d-b3a3-669503580446/cluster-samples-operator/0.log" Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.430469 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" event={"ID":"9a1c67e3-f6df-4b4d-b3a3-669503580446","Type":"ContainerStarted","Data":"0a65910acc46447be87b6e67907b65f26084070a5be8a0b720524a4f076a1bbe"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.433481 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2a178b03-e81c-47af-898a-0463f964e327","Type":"ContainerStarted","Data":"d5e688aeef6f62de6c564f71497e2a41fce184df58655c98c950861c2322f5d8"} Mar 10 00:10:12 
crc kubenswrapper[4994]: I0310 00:10:12.435031 4994 generic.go:334] "Generic (PLEG): container finished" podID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerID="f5717a500fcfc936aa966df3e8984d98f5ff5ab90d17718d04d543deea170e1a" exitCode=0 Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.435100 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrh9x" event={"ID":"a4a4dc2d-502f-4c05-ab76-1cc708f13006","Type":"ContainerDied","Data":"f5717a500fcfc936aa966df3e8984d98f5ff5ab90d17718d04d543deea170e1a"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.436925 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" event={"ID":"aa106de9-72a4-4364-a10d-2ec2c543afcf","Type":"ContainerStarted","Data":"eead1cddbea92663bf5592f78a9fd2d9a4a50429baaf76e2d49ea3e824e4b343"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.437640 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.439644 4994 generic.go:334] "Generic (PLEG): container finished" podID="6525b40b-1c23-4533-a025-4d86bc406f00" containerID="aae194d8e4c12d216b3165e539102c03919767ba8d6987e2d169c5147eb55863" exitCode=0 Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.439701 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpd8k" event={"ID":"6525b40b-1c23-4533-a025-4d86bc406f00","Type":"ContainerDied","Data":"aae194d8e4c12d216b3165e539102c03919767ba8d6987e2d169c5147eb55863"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.443700 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.444754 4994 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" event={"ID":"51ce0bbc-ee87-47f6-be5d-24f40386cb60","Type":"ContainerStarted","Data":"d0af519df7d4889c7f4e2a422bbfa5a0aa335234246b0389b4653db0004e1db2"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.444939 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.458119 4994 generic.go:334] "Generic (PLEG): container finished" podID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerID="80f8634f7b8323c210693d621de8d8f6643dfa095b77ff7b2c7b90894cebf6e9" exitCode=0 Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.458246 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzrd2" event={"ID":"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c","Type":"ContainerDied","Data":"80f8634f7b8323c210693d621de8d8f6643dfa095b77ff7b2c7b90894cebf6e9"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.462505 4994 generic.go:334] "Generic (PLEG): container finished" podID="0429fae4-1356-4d61-86a3-267f74f27636" containerID="5520b611519111e180e88d1153308daf75771503d82a595371d6519dba75f44f" exitCode=0 Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.462652 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5kj4" event={"ID":"0429fae4-1356-4d61-86a3-267f74f27636","Type":"ContainerDied","Data":"5520b611519111e180e88d1153308daf75771503d82a595371d6519dba75f44f"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.468385 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.473060 4994 generic.go:334] "Generic (PLEG): container finished" podID="76aa065c-ed60-4237-b36f-5ce2865256ff" 
containerID="ca43f34122075e40b3a59998e1c0fdcc5eee5438f96f373d4f9e4b36228204ee" exitCode=0 Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.473376 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zv2kt" event={"ID":"76aa065c-ed60-4237-b36f-5ce2865256ff","Type":"ContainerDied","Data":"ca43f34122075e40b3a59998e1c0fdcc5eee5438f96f373d4f9e4b36228204ee"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.478645 4994 generic.go:334] "Generic (PLEG): container finished" podID="abe30cce-8379-4db8-838b-f48b4bc96621" containerID="b12c2570f0f12ececa7d019201ed8ccc106ee186110fc56077d24c5532fccef4" exitCode=0 Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.478730 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7qcn" event={"ID":"abe30cce-8379-4db8-838b-f48b4bc96621","Type":"ContainerDied","Data":"b12c2570f0f12ececa7d019201ed8ccc106ee186110fc56077d24c5532fccef4"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.489989 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=12.489970212 podStartE2EDuration="12.489970212s" podCreationTimestamp="2026-03-10 00:10:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:10:12.486034624 +0000 UTC m=+226.659741373" watchObservedRunningTime="2026-03-10 00:10:12.489970212 +0000 UTC m=+226.663676961" Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.490091 4994 generic.go:334] "Generic (PLEG): container finished" podID="fdad0261-804d-41dc-8a25-48018f136c0f" containerID="d9c63ca86073ed9073e0f89d99a6a3af753621532c12e53bb512c115a6852ded" exitCode=0 Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.490123 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwzk5" 
event={"ID":"fdad0261-804d-41dc-8a25-48018f136c0f","Type":"ContainerDied","Data":"d9c63ca86073ed9073e0f89d99a6a3af753621532c12e53bb512c115a6852ded"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.499486 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.499529 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.499537 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.499587 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.507435 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:12 crc kubenswrapper[4994]: E0310 
00:10:12.507640 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:13.007621334 +0000 UTC m=+227.181328083 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.507812 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:12 crc kubenswrapper[4994]: E0310 00:10:12.508241 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:13.00822333 +0000 UTC m=+227.181930089 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.588488 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" podStartSLOduration=11.588475928 podStartE2EDuration="11.588475928s" podCreationTimestamp="2026-03-10 00:10:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:10:12.585684078 +0000 UTC m=+226.759390827" watchObservedRunningTime="2026-03-10 00:10:12.588475928 +0000 UTC m=+226.762182677" Mar 10 00:10:13 crc kubenswrapper[4994]: I0310 00:10:13.872057 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" podStartSLOduration=12.872040886 podStartE2EDuration="12.872040886s" podCreationTimestamp="2026-03-10 00:10:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:10:13.868932888 +0000 UTC m=+228.042639637" watchObservedRunningTime="2026-03-10 00:10:13.872040886 +0000 UTC m=+228.045747635" Mar 10 00:10:13 crc kubenswrapper[4994]: I0310 00:10:13.873855 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:13 crc kubenswrapper[4994]: E0310 00:10:13.874689 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.874668402 +0000 UTC m=+229.048375151 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:13 crc kubenswrapper[4994]: I0310 00:10:13.893847 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:13 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:13 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:13 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:13 crc kubenswrapper[4994]: I0310 00:10:13.894005 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:13 crc kubenswrapper[4994]: I0310 00:10:13.977530 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:13 crc kubenswrapper[4994]: E0310 00:10:13.979096 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.479075177 +0000 UTC m=+228.652781926 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.086771 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.087128 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.587112451 +0000 UTC m=+228.760819200 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.188021 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.188173 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.688149901 +0000 UTC m=+228.861856650 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.188217 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.188517 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.68850978 +0000 UTC m=+228.862216519 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.289490 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.289669 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.789644013 +0000 UTC m=+228.963350762 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.290056 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.290376 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.790369541 +0000 UTC m=+228.964076290 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.391589 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.391768 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.891735679 +0000 UTC m=+229.065442428 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.392025 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.392406 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.892396645 +0000 UTC m=+229.066103394 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.493738 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.493954 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.993918017 +0000 UTC m=+229.167624776 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.494551 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.494981 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.994964754 +0000 UTC m=+229.168671503 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.595812 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.596091 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.096063805 +0000 UTC m=+229.269770564 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.596310 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.596757 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.096741102 +0000 UTC m=+229.270447851 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.697697 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.697926 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.197862554 +0000 UTC m=+229.371569303 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.698034 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.698338 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.198326355 +0000 UTC m=+229.372033104 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.799680 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.799860 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.299832707 +0000 UTC m=+229.473539456 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.799992 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.800304 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.300296769 +0000 UTC m=+229.474003518 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.854786 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:14 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:14 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:14 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.854838 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.900488 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.900682 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:10:15.400649832 +0000 UTC m=+229.574356581 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.900824 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.901180 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.401168745 +0000 UTC m=+229.574875494 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.903993 4994 generic.go:334] "Generic (PLEG): container finished" podID="2a178b03-e81c-47af-898a-0463f964e327" containerID="d5e688aeef6f62de6c564f71497e2a41fce184df58655c98c950861c2322f5d8" exitCode=0 Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.904064 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2a178b03-e81c-47af-898a-0463f964e327","Type":"ContainerDied","Data":"d5e688aeef6f62de6c564f71497e2a41fce184df58655c98c950861c2322f5d8"} Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.905666 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551690-7rbl8" event={"ID":"f04aae5d-b067-4e49-82f3-66412ec1bba6","Type":"ContainerStarted","Data":"669d56e78759519de5a6dd239e9cf24e944424e3eb64de05a26d842e32401407"} Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.907580 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551688-9zsf6" event={"ID":"a1456dd8-5038-4bcc-8f19-51325ac84c02","Type":"ContainerStarted","Data":"ee78e5054ad5ad8a035342e7024985079f992d0c77022319d8f1e7f3d55f9eb9"} Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.910285 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" 
event={"ID":"7fd7640d-700a-420e-b15f-7f681090727b","Type":"ContainerStarted","Data":"41a63e3cad206deed5794d5cf2ffcfa63552cbdbe5b01fdda5d535ac3a1fe33c"} Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.941671 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551688-9zsf6" podStartSLOduration=117.689488346 podStartE2EDuration="2m14.941655258s" podCreationTimestamp="2026-03-10 00:08:00 +0000 UTC" firstStartedPulling="2026-03-10 00:09:53.793929549 +0000 UTC m=+207.967636298" lastFinishedPulling="2026-03-10 00:10:11.046096461 +0000 UTC m=+225.219803210" observedRunningTime="2026-03-10 00:10:14.939637218 +0000 UTC m=+229.113343967" watchObservedRunningTime="2026-03-10 00:10:14.941655258 +0000 UTC m=+229.115362007" Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.953331 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551690-7rbl8" podStartSLOduration=10.642376352 podStartE2EDuration="14.953312929s" podCreationTimestamp="2026-03-10 00:10:00 +0000 UTC" firstStartedPulling="2026-03-10 00:10:06.810199502 +0000 UTC m=+220.983906251" lastFinishedPulling="2026-03-10 00:10:11.121136079 +0000 UTC m=+225.294842828" observedRunningTime="2026-03-10 00:10:14.95014692 +0000 UTC m=+229.123853659" watchObservedRunningTime="2026-03-10 00:10:14.953312929 +0000 UTC m=+229.127019688" Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.002178 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.002370 4994 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.502341097 +0000 UTC m=+229.676047856 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.002599 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.004175 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.504161563 +0000 UTC m=+229.677868312 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.104309 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.104488 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.604461534 +0000 UTC m=+229.778168293 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.104897 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.105248 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.605236634 +0000 UTC m=+229.778943373 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.204446 4994 csr.go:261] certificate signing request csr-tm2rk is approved, waiting to be issued Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.207891 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.208001 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.707976686 +0000 UTC m=+229.881683435 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.208100 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.208413 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.708405696 +0000 UTC m=+229.882112445 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.208700 4994 csr.go:257] certificate signing request csr-tm2rk is issued Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.310016 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.310357 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.810342659 +0000 UTC m=+229.984049408 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.411889 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.412249 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.9122355 +0000 UTC m=+230.085942249 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.512517 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.512649 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:16.012629844 +0000 UTC m=+230.186336593 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.513035 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.513359 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:16.013343611 +0000 UTC m=+230.187050360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.614365 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.614558 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:16.114530775 +0000 UTC m=+230.288237524 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.614590 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.614939 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:16.114931916 +0000 UTC m=+230.288638655 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.715255 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.715592 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:16.215577836 +0000 UTC m=+230.389284585 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.727552 4994 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.822505 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.822819 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:16.3228063 +0000 UTC m=+230.496513049 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.859800 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:15 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:15 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:15 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.859858 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.874394 4994 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-10T00:10:15.727582296Z","Handler":null,"Name":""} Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.917938 4994 generic.go:334] "Generic (PLEG): container finished" podID="f04aae5d-b067-4e49-82f3-66412ec1bba6" containerID="669d56e78759519de5a6dd239e9cf24e944424e3eb64de05a26d842e32401407" exitCode=0 Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.918006 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29551690-7rbl8" event={"ID":"f04aae5d-b067-4e49-82f3-66412ec1bba6","Type":"ContainerDied","Data":"669d56e78759519de5a6dd239e9cf24e944424e3eb64de05a26d842e32401407"} Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.920212 4994 generic.go:334] "Generic (PLEG): container finished" podID="a1456dd8-5038-4bcc-8f19-51325ac84c02" containerID="ee78e5054ad5ad8a035342e7024985079f992d0c77022319d8f1e7f3d55f9eb9" exitCode=0 Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.920256 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551688-9zsf6" event={"ID":"a1456dd8-5038-4bcc-8f19-51325ac84c02","Type":"ContainerDied","Data":"ee78e5054ad5ad8a035342e7024985079f992d0c77022319d8f1e7f3d55f9eb9"} Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.924130 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.924357 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:16.424334332 +0000 UTC m=+230.598041121 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.932240 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" event={"ID":"7fd7640d-700a-420e-b15f-7f681090727b","Type":"ContainerStarted","Data":"51f4c87762c30d5a257d71a06dfa2c77c08a659ae1961b259ddd8f58a65eeb3f"} Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.932286 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" event={"ID":"7fd7640d-700a-420e-b15f-7f681090727b","Type":"ContainerStarted","Data":"b86c8f290ec062a91de2823bf6419962c51cff1c96848b8dca59f8631b305c5c"} Mar 10 00:10:16 crc kubenswrapper[4994]: I0310 00:10:16.256003 4994 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-02 11:23:51.115399071 +0000 UTC Mar 10 00:10:16 crc kubenswrapper[4994]: I0310 00:10:16.256046 4994 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7163h13m34.859355624s for next certificate rotation Mar 10 00:10:16 crc kubenswrapper[4994]: I0310 00:10:16.257357 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:16 crc 
kubenswrapper[4994]: E0310 00:10:16.258244 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:16.758233012 +0000 UTC m=+230.931939761 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:16 crc kubenswrapper[4994]: I0310 00:10:16.953734 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:16 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:16 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:16 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:16 crc kubenswrapper[4994]: I0310 00:10:16.953792 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:16 crc kubenswrapper[4994]: I0310 00:10:16.954411 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:16 crc kubenswrapper[4994]: E0310 00:10:16.979641 4994 goroutinemap.go:150] Operation for "/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" failed. No retries permitted until 2026-03-10 00:10:17.479621795 +0000 UTC m=+231.653328544 (durationBeforeRetry 500ms). Error: RegisterPlugin error -- failed to get plugin info using RPC GetInfo at socket /var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock, err: rpc error: code = DeadlineExceeded desc = context deadline exceeded Mar 10 00:10:16 crc kubenswrapper[4994]: E0310 00:10:16.979746 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:17.979728757 +0000 UTC m=+232.153435506 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:16 crc kubenswrapper[4994]: I0310 00:10:16.990692 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" podUID="7fd7640d-700a-420e-b15f-7f681090727b" containerName="hostpath-provisioner" probeResult="failure" output="Get \"http://10.217.0.45:9898/healthz\": dial tcp 10.217.0.45:9898: connect: connection refused" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.049230 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" 
podStartSLOduration=28.049209037 podStartE2EDuration="28.049209037s" podCreationTimestamp="2026-03-10 00:09:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:10:17.044800396 +0000 UTC m=+231.218507155" watchObservedRunningTime="2026-03-10 00:10:17.049209037 +0000 UTC m=+231.222915776" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.056135 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:17 crc kubenswrapper[4994]: E0310 00:10:17.057040 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:17.557026112 +0000 UTC m=+231.730732861 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.160717 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:17 crc kubenswrapper[4994]: E0310 00:10:17.161029 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:17.661018086 +0000 UTC m=+231.834724835 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.257136 4994 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-01 10:11:04.698311644 +0000 UTC Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.257169 4994 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6394h0m47.4411447s for next certificate rotation Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.262239 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:17 crc kubenswrapper[4994]: E0310 00:10:17.262475 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:17.762449986 +0000 UTC m=+231.936156735 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.301724 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.363738 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:17 crc kubenswrapper[4994]: E0310 00:10:17.364030 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:17.864018829 +0000 UTC m=+232.037725578 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.444081 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551690-7rbl8" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.465260 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.465401 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2a178b03-e81c-47af-898a-0463f964e327-kubelet-dir\") pod \"2a178b03-e81c-47af-898a-0463f964e327\" (UID: \"2a178b03-e81c-47af-898a-0463f964e327\") " Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.465426 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2a178b03-e81c-47af-898a-0463f964e327-kube-api-access\") pod \"2a178b03-e81c-47af-898a-0463f964e327\" (UID: \"2a178b03-e81c-47af-898a-0463f964e327\") " Mar 10 00:10:17 crc kubenswrapper[4994]: E0310 00:10:17.465792 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:17.965759626 +0000 UTC m=+232.139466385 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.465928 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2a178b03-e81c-47af-898a-0463f964e327-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2a178b03-e81c-47af-898a-0463f964e327" (UID: "2a178b03-e81c-47af-898a-0463f964e327"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.470803 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a178b03-e81c-47af-898a-0463f964e327-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2a178b03-e81c-47af-898a-0463f964e327" (UID: "2a178b03-e81c-47af-898a-0463f964e327"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.490173 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551688-9zsf6" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.566335 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97ch2\" (UniqueName: \"kubernetes.io/projected/f04aae5d-b067-4e49-82f3-66412ec1bba6-kube-api-access-97ch2\") pod \"f04aae5d-b067-4e49-82f3-66412ec1bba6\" (UID: \"f04aae5d-b067-4e49-82f3-66412ec1bba6\") " Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.566700 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.566755 4994 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2a178b03-e81c-47af-898a-0463f964e327-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.566768 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2a178b03-e81c-47af-898a-0463f964e327-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:17 crc kubenswrapper[4994]: E0310 00:10:17.567038 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:18.067028551 +0000 UTC m=+232.240735300 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.570769 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f04aae5d-b067-4e49-82f3-66412ec1bba6-kube-api-access-97ch2" (OuterVolumeSpecName: "kube-api-access-97ch2") pod "f04aae5d-b067-4e49-82f3-66412ec1bba6" (UID: "f04aae5d-b067-4e49-82f3-66412ec1bba6"). InnerVolumeSpecName "kube-api-access-97ch2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.668350 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.668427 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4fvw\" (UniqueName: \"kubernetes.io/projected/a1456dd8-5038-4bcc-8f19-51325ac84c02-kube-api-access-m4fvw\") pod \"a1456dd8-5038-4bcc-8f19-51325ac84c02\" (UID: \"a1456dd8-5038-4bcc-8f19-51325ac84c02\") " Mar 10 00:10:17 crc kubenswrapper[4994]: E0310 00:10:17.668591 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:10:18.168558854 +0000 UTC m=+232.342265613 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.668814 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:17 crc kubenswrapper[4994]: E0310 00:10:17.669210 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:18.16918125 +0000 UTC m=+232.342887999 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.669260 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97ch2\" (UniqueName: \"kubernetes.io/projected/f04aae5d-b067-4e49-82f3-66412ec1bba6-kube-api-access-97ch2\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.672647 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1456dd8-5038-4bcc-8f19-51325ac84c02-kube-api-access-m4fvw" (OuterVolumeSpecName: "kube-api-access-m4fvw") pod "a1456dd8-5038-4bcc-8f19-51325ac84c02" (UID: "a1456dd8-5038-4bcc-8f19-51325ac84c02"). InnerVolumeSpecName "kube-api-access-m4fvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.770684 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.771116 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4fvw\" (UniqueName: \"kubernetes.io/projected/a1456dd8-5038-4bcc-8f19-51325ac84c02-kube-api-access-m4fvw\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:17 crc kubenswrapper[4994]: E0310 00:10:17.771197 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:18.271180013 +0000 UTC m=+232.444886762 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.853201 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:17 crc kubenswrapper[4994]: [+]has-synced ok Mar 10 00:10:17 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:17 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.853262 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.872894 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:17 crc kubenswrapper[4994]: E0310 00:10:17.873198 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 00:10:18.373186588 +0000 UTC m=+232.546893337 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.953438 4994 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-10T00:10:15.727582296Z","Handler":null,"Name":""} Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.958029 4994 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.958065 4994 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.974566 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.979652 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" 
(OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.014445 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551690-7rbl8" event={"ID":"f04aae5d-b067-4e49-82f3-66412ec1bba6","Type":"ContainerDied","Data":"4311de40639f67ea8a55d45863ea9b8bade3cfa62b21c735dd8d526f7c5e805a"} Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.015101 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4311de40639f67ea8a55d45863ea9b8bade3cfa62b21c735dd8d526f7c5e805a" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.015060 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551690-7rbl8" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.023195 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551688-9zsf6" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.023207 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551688-9zsf6" event={"ID":"a1456dd8-5038-4bcc-8f19-51325ac84c02","Type":"ContainerDied","Data":"5834bb42526a131e58f316ca58bdd93f998331f2c74c80c7a568c0b2a5d292c8"} Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.023554 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5834bb42526a131e58f316ca58bdd93f998331f2c74c80c7a568c0b2a5d292c8" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.026345 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2a178b03-e81c-47af-898a-0463f964e327","Type":"ContainerDied","Data":"c9aac967fb18b6328851a9397e77955184a1e3f93d91c3163e98cfc3de1758cd"} Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.026364 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9aac967fb18b6328851a9397e77955184a1e3f93d91c3163e98cfc3de1758cd" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.026411 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:10:18 crc kubenswrapper[4994]: E0310 00:10:18.062149 4994 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf04aae5d_b067_4e49_82f3_66412ec1bba6.slice\": RecentStats: unable to find data in memory cache]" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.076394 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.452154 4994 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.452228 4994 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.560599 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.606385 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.625859 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.826635 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-75h8c"] Mar 10 00:10:18 crc kubenswrapper[4994]: W0310 00:10:18.833394 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod295cba62_fd24_4245_8773_866ee134a29e.slice/crio-dd7db90166bf3d060ca8294e206e48fea14498e48c7b2ef7fe0c1e7d9d4dd09f WatchSource:0}: Error finding container dd7db90166bf3d060ca8294e206e48fea14498e48c7b2ef7fe0c1e7d9d4dd09f: Status 404 returned error can't find the container with id dd7db90166bf3d060ca8294e206e48fea14498e48c7b2ef7fe0c1e7d9d4dd09f Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.892778 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.892823 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.894346 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.904766 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:10:19 crc kubenswrapper[4994]: I0310 
00:10:19.033801 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" event={"ID":"295cba62-fd24-4245-8773-866ee134a29e","Type":"ContainerStarted","Data":"dd7db90166bf3d060ca8294e206e48fea14498e48c7b2ef7fe0c1e7d9d4dd09f"} Mar 10 00:10:21 crc kubenswrapper[4994]: I0310 00:10:21.069800 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c67ff489b-x49rf"] Mar 10 00:10:21 crc kubenswrapper[4994]: I0310 00:10:21.070177 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"] Mar 10 00:10:21 crc kubenswrapper[4994]: I0310 00:10:21.072126 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" podUID="51ce0bbc-ee87-47f6-be5d-24f40386cb60" containerName="route-controller-manager" containerID="cri-o://d0af519df7d4889c7f4e2a422bbfa5a0aa335234246b0389b4653db0004e1db2" gracePeriod=30 Mar 10 00:10:21 crc kubenswrapper[4994]: I0310 00:10:21.072676 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" podUID="aa106de9-72a4-4364-a10d-2ec2c543afcf" containerName="controller-manager" containerID="cri-o://eead1cddbea92663bf5592f78a9fd2d9a4a50429baaf76e2d49ea3e824e4b343" gracePeriod=30 Mar 10 00:10:21 crc kubenswrapper[4994]: I0310 00:10:21.082768 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" event={"ID":"295cba62-fd24-4245-8773-866ee134a29e","Type":"ContainerStarted","Data":"0b99026327a0246e8d6a6998d063d7da1dc8dded77c229f16aea5a63dc4137ba"} Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.354202 4994 patch_prober.go:28] interesting pod/console-f9d7485db-rlqtz container/console namespace/openshift-console: Startup 
probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.354254 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rlqtz" podUID="11b78073-cc4a-4a6f-89ab-631fde4b3371" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.356282 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.407349 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" podStartSLOduration=191.407329733 podStartE2EDuration="3m11.407329733s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:10:22.397395945 +0000 UTC m=+236.571102694" watchObservedRunningTime="2026-03-10 00:10:22.407329733 +0000 UTC m=+236.581036482" Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.499853 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.500079 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 
00:10:22.500238 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.500308 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-8lrmb" Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.500292 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.501342 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"131f6f968be699b4510e1711ff70f7d98fd24e9b749c0ac094982fa64eb070f5"} pod="openshift-console/downloads-7954f5f757-8lrmb" containerMessage="Container download-server failed liveness probe, will be restarted" Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.501383 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" containerID="cri-o://131f6f968be699b4510e1711ff70f7d98fd24e9b749c0ac094982fa64eb070f5" gracePeriod=2 Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.501652 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:10:22 
crc kubenswrapper[4994]: I0310 00:10:22.501707 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:10:24 crc kubenswrapper[4994]: I0310 00:10:24.373077 4994 generic.go:334] "Generic (PLEG): container finished" podID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerID="131f6f968be699b4510e1711ff70f7d98fd24e9b749c0ac094982fa64eb070f5" exitCode=0 Mar 10 00:10:24 crc kubenswrapper[4994]: I0310 00:10:24.373254 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8lrmb" event={"ID":"4fb67636-fcba-4975-a460-403cd6ee9c25","Type":"ContainerDied","Data":"131f6f968be699b4510e1711ff70f7d98fd24e9b749c0ac094982fa64eb070f5"} Mar 10 00:10:24 crc kubenswrapper[4994]: I0310 00:10:24.376104 4994 generic.go:334] "Generic (PLEG): container finished" podID="51ce0bbc-ee87-47f6-be5d-24f40386cb60" containerID="d0af519df7d4889c7f4e2a422bbfa5a0aa335234246b0389b4653db0004e1db2" exitCode=0 Mar 10 00:10:24 crc kubenswrapper[4994]: I0310 00:10:24.376172 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" event={"ID":"51ce0bbc-ee87-47f6-be5d-24f40386cb60","Type":"ContainerDied","Data":"d0af519df7d4889c7f4e2a422bbfa5a0aa335234246b0389b4653db0004e1db2"} Mar 10 00:10:24 crc kubenswrapper[4994]: I0310 00:10:24.378490 4994 generic.go:334] "Generic (PLEG): container finished" podID="aa106de9-72a4-4364-a10d-2ec2c543afcf" containerID="eead1cddbea92663bf5592f78a9fd2d9a4a50429baaf76e2d49ea3e824e4b343" exitCode=0 Mar 10 00:10:24 crc kubenswrapper[4994]: I0310 00:10:24.378523 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" 
event={"ID":"aa106de9-72a4-4364-a10d-2ec2c543afcf","Type":"ContainerDied","Data":"eead1cddbea92663bf5592f78a9fd2d9a4a50429baaf76e2d49ea3e824e4b343"} Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.201245 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.232484 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs"] Mar 10 00:10:30 crc kubenswrapper[4994]: E0310 00:10:30.232987 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04aae5d-b067-4e49-82f3-66412ec1bba6" containerName="oc" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.233004 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04aae5d-b067-4e49-82f3-66412ec1bba6" containerName="oc" Mar 10 00:10:30 crc kubenswrapper[4994]: E0310 00:10:30.233027 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1456dd8-5038-4bcc-8f19-51325ac84c02" containerName="oc" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.233034 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1456dd8-5038-4bcc-8f19-51325ac84c02" containerName="oc" Mar 10 00:10:30 crc kubenswrapper[4994]: E0310 00:10:30.233046 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ce0bbc-ee87-47f6-be5d-24f40386cb60" containerName="route-controller-manager" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.233054 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ce0bbc-ee87-47f6-be5d-24f40386cb60" containerName="route-controller-manager" Mar 10 00:10:30 crc kubenswrapper[4994]: E0310 00:10:30.233072 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a178b03-e81c-47af-898a-0463f964e327" containerName="pruner" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.233080 4994 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2a178b03-e81c-47af-898a-0463f964e327" containerName="pruner" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.233300 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1456dd8-5038-4bcc-8f19-51325ac84c02" containerName="oc" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.233316 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a178b03-e81c-47af-898a-0463f964e327" containerName="pruner" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.233330 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f04aae5d-b067-4e49-82f3-66412ec1bba6" containerName="oc" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.233346 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ce0bbc-ee87-47f6-be5d-24f40386cb60" containerName="route-controller-manager" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.233909 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.248567 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs"] Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.386806 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-client-ca\") pod \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.387005 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51ce0bbc-ee87-47f6-be5d-24f40386cb60-serving-cert\") pod \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.387043 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-config\") pod \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.387075 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vb2m\" (UniqueName: \"kubernetes.io/projected/51ce0bbc-ee87-47f6-be5d-24f40386cb60-kube-api-access-6vb2m\") pod \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.387254 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9ffb940-ad05-42fe-99dc-2ca36481a566-serving-cert\") pod 
\"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.387319 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-config\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.387346 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-client-ca\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.387386 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r94jz\" (UniqueName: \"kubernetes.io/projected/f9ffb940-ad05-42fe-99dc-2ca36481a566-kube-api-access-r94jz\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.387967 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-config" (OuterVolumeSpecName: "config") pod "51ce0bbc-ee87-47f6-be5d-24f40386cb60" (UID: "51ce0bbc-ee87-47f6-be5d-24f40386cb60"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.388112 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-client-ca" (OuterVolumeSpecName: "client-ca") pod "51ce0bbc-ee87-47f6-be5d-24f40386cb60" (UID: "51ce0bbc-ee87-47f6-be5d-24f40386cb60"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.395129 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ce0bbc-ee87-47f6-be5d-24f40386cb60-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "51ce0bbc-ee87-47f6-be5d-24f40386cb60" (UID: "51ce0bbc-ee87-47f6-be5d-24f40386cb60"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.401139 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51ce0bbc-ee87-47f6-be5d-24f40386cb60-kube-api-access-6vb2m" (OuterVolumeSpecName: "kube-api-access-6vb2m") pod "51ce0bbc-ee87-47f6-be5d-24f40386cb60" (UID: "51ce0bbc-ee87-47f6-be5d-24f40386cb60"). InnerVolumeSpecName "kube-api-access-6vb2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.413042 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" event={"ID":"51ce0bbc-ee87-47f6-be5d-24f40386cb60","Type":"ContainerDied","Data":"b99811bd76278a20c75ea5a5530b5792fd876cafc8ee3f721f73cedbdb0b24d7"} Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.413157 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.413312 4994 scope.go:117] "RemoveContainer" containerID="d0af519df7d4889c7f4e2a422bbfa5a0aa335234246b0389b4653db0004e1db2" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.463361 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"] Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.469467 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"] Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.489754 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9ffb940-ad05-42fe-99dc-2ca36481a566-serving-cert\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.489968 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-config\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.492377 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-config\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc 
kubenswrapper[4994]: I0310 00:10:30.494364 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-client-ca\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.492567 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-client-ca\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.494550 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r94jz\" (UniqueName: \"kubernetes.io/projected/f9ffb940-ad05-42fe-99dc-2ca36481a566-kube-api-access-r94jz\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.495243 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.495311 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vb2m\" (UniqueName: \"kubernetes.io/projected/51ce0bbc-ee87-47f6-be5d-24f40386cb60-kube-api-access-6vb2m\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.495256 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f9ffb940-ad05-42fe-99dc-2ca36481a566-serving-cert\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs"
Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.495344 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.495512 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51ce0bbc-ee87-47f6-be5d-24f40386cb60-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.514322 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r94jz\" (UniqueName: \"kubernetes.io/projected/f9ffb940-ad05-42fe-99dc-2ca36481a566-kube-api-access-r94jz\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs"
Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.564330 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs"
Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.565554 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51ce0bbc-ee87-47f6-be5d-24f40386cb60" path="/var/lib/kubelet/pods/51ce0bbc-ee87-47f6-be5d-24f40386cb60/volumes"
Mar 10 00:10:31 crc kubenswrapper[4994]: I0310 00:10:31.012277 4994 patch_prober.go:28] interesting pod/route-controller-manager-6df8f76c79-nrqgg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 10 00:10:31 crc kubenswrapper[4994]: I0310 00:10:31.012366 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" podUID="51ce0bbc-ee87-47f6-be5d-24f40386cb60" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 10 00:10:31 crc kubenswrapper[4994]: I0310 00:10:31.017637 4994 patch_prober.go:28] interesting pod/controller-manager-5c67ff489b-x49rf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 10 00:10:31 crc kubenswrapper[4994]: I0310 00:10:31.017685 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" podUID="aa106de9-72a4-4364-a10d-2ec2c543afcf" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 10 00:10:32 crc kubenswrapper[4994]: I0310 00:10:32.190659 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:10:32 crc kubenswrapper[4994]: I0310 00:10:32.198825 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:10:32 crc kubenswrapper[4994]: I0310 00:10:32.287929 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8"
Mar 10 00:10:32 crc kubenswrapper[4994]: I0310 00:10:32.499300 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:10:32 crc kubenswrapper[4994]: I0310 00:10:32.499350 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:10:34 crc kubenswrapper[4994]: I0310 00:10:34.980669 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 10 00:10:34 crc kubenswrapper[4994]: I0310 00:10:34.981932 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 10 00:10:34 crc kubenswrapper[4994]: I0310 00:10:34.984062 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 10 00:10:34 crc kubenswrapper[4994]: I0310 00:10:34.984540 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 10 00:10:34 crc kubenswrapper[4994]: I0310 00:10:34.989585 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 10 00:10:35 crc kubenswrapper[4994]: I0310 00:10:35.068517 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3b7fc03e-d1af-479d-9315-0f25283f3aa1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 10 00:10:35 crc kubenswrapper[4994]: I0310 00:10:35.068675 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3b7fc03e-d1af-479d-9315-0f25283f3aa1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 10 00:10:35 crc kubenswrapper[4994]: I0310 00:10:35.169795 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3b7fc03e-d1af-479d-9315-0f25283f3aa1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 10 00:10:35 crc kubenswrapper[4994]: I0310 00:10:35.169862 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3b7fc03e-d1af-479d-9315-0f25283f3aa1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 10 00:10:35 crc kubenswrapper[4994]: I0310 00:10:35.170258 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3b7fc03e-d1af-479d-9315-0f25283f3aa1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 10 00:10:35 crc kubenswrapper[4994]: I0310 00:10:35.200243 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3b7fc03e-d1af-479d-9315-0f25283f3aa1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 10 00:10:35 crc kubenswrapper[4994]: I0310 00:10:35.302069 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 10 00:10:37 crc kubenswrapper[4994]: I0310 00:10:37.023022 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 00:10:38 crc kubenswrapper[4994]: I0310 00:10:38.638419 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.774703 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.776711 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.797540 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.890521 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kube-api-access\") pod \"installer-9-crc\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.890741 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.890878 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-var-lock\") pod \"installer-9-crc\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.992150 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kube-api-access\") pod \"installer-9-crc\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.992280 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.992369 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-var-lock\") pod \"installer-9-crc\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.992421 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.992472 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-var-lock\") pod \"installer-9-crc\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 00:10:41 crc kubenswrapper[4994]: I0310 00:10:41.017151 4994 patch_prober.go:28] interesting pod/controller-manager-5c67ff489b-x49rf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 10 00:10:41 crc kubenswrapper[4994]: I0310 00:10:41.017221 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" podUID="aa106de9-72a4-4364-a10d-2ec2c543afcf" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 10 00:10:41 crc kubenswrapper[4994]: I0310 00:10:41.018380 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kube-api-access\") pod \"installer-9-crc\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 00:10:41 crc kubenswrapper[4994]: I0310 00:10:41.115010 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 00:10:41 crc kubenswrapper[4994]: I0310 00:10:41.149341 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs"]
Mar 10 00:10:42 crc kubenswrapper[4994]: I0310 00:10:42.501102 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:10:42 crc kubenswrapper[4994]: I0310 00:10:42.501185 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:10:48 crc kubenswrapper[4994]: I0310 00:10:48.893230 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 00:10:48 crc kubenswrapper[4994]: I0310 00:10:48.893327 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 00:10:51 crc kubenswrapper[4994]: I0310 00:10:51.022430 4994 patch_prober.go:28] interesting pod/controller-manager-5c67ff489b-x49rf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 10 00:10:51 crc kubenswrapper[4994]: I0310 00:10:51.022948 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" podUID="aa106de9-72a4-4364-a10d-2ec2c543afcf" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 10 00:10:51 crc kubenswrapper[4994]: I0310 00:10:51.966862 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.011305 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7987b4b568-rd8h8"]
Mar 10 00:10:52 crc kubenswrapper[4994]: E0310 00:10:52.011557 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa106de9-72a4-4364-a10d-2ec2c543afcf" containerName="controller-manager"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.011572 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa106de9-72a4-4364-a10d-2ec2c543afcf" containerName="controller-manager"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.011733 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa106de9-72a4-4364-a10d-2ec2c543afcf" containerName="controller-manager"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.012242 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.048551 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7987b4b568-rd8h8"]
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.135830 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-client-ca\") pod \"aa106de9-72a4-4364-a10d-2ec2c543afcf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") "
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.136234 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-proxy-ca-bundles\") pod \"aa106de9-72a4-4364-a10d-2ec2c543afcf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") "
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.136356 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-config\") pod \"aa106de9-72a4-4364-a10d-2ec2c543afcf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") "
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.136478 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7jgf\" (UniqueName: \"kubernetes.io/projected/aa106de9-72a4-4364-a10d-2ec2c543afcf-kube-api-access-c7jgf\") pod \"aa106de9-72a4-4364-a10d-2ec2c543afcf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") "
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.136574 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa106de9-72a4-4364-a10d-2ec2c543afcf-serving-cert\") pod \"aa106de9-72a4-4364-a10d-2ec2c543afcf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") "
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.136814 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-client-ca\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.136924 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9lft\" (UniqueName: \"kubernetes.io/projected/ae5ec419-c993-43ff-b664-703b8b5a3d5a-kube-api-access-p9lft\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.137000 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-proxy-ca-bundles\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.137008 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "aa106de9-72a4-4364-a10d-2ec2c543afcf" (UID: "aa106de9-72a4-4364-a10d-2ec2c543afcf"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.137035 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-client-ca" (OuterVolumeSpecName: "client-ca") pod "aa106de9-72a4-4364-a10d-2ec2c543afcf" (UID: "aa106de9-72a4-4364-a10d-2ec2c543afcf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.137182 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-config" (OuterVolumeSpecName: "config") pod "aa106de9-72a4-4364-a10d-2ec2c543afcf" (UID: "aa106de9-72a4-4364-a10d-2ec2c543afcf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.137193 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-config\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.137403 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae5ec419-c993-43ff-b664-703b8b5a3d5a-serving-cert\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.137554 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.137577 4994 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.137591 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-config\") on node \"crc\" DevicePath \"\""
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.141216 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa106de9-72a4-4364-a10d-2ec2c543afcf-kube-api-access-c7jgf" (OuterVolumeSpecName: "kube-api-access-c7jgf") pod "aa106de9-72a4-4364-a10d-2ec2c543afcf" (UID: "aa106de9-72a4-4364-a10d-2ec2c543afcf"). InnerVolumeSpecName "kube-api-access-c7jgf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.145473 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa106de9-72a4-4364-a10d-2ec2c543afcf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "aa106de9-72a4-4364-a10d-2ec2c543afcf" (UID: "aa106de9-72a4-4364-a10d-2ec2c543afcf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.238682 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-client-ca\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.238736 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9lft\" (UniqueName: \"kubernetes.io/projected/ae5ec419-c993-43ff-b664-703b8b5a3d5a-kube-api-access-p9lft\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.238762 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-proxy-ca-bundles\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.238794 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-config\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.238847 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae5ec419-c993-43ff-b664-703b8b5a3d5a-serving-cert\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.238918 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7jgf\" (UniqueName: \"kubernetes.io/projected/aa106de9-72a4-4364-a10d-2ec2c543afcf-kube-api-access-c7jgf\") on node \"crc\" DevicePath \"\""
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.238933 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa106de9-72a4-4364-a10d-2ec2c543afcf-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.242431 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae5ec419-c993-43ff-b664-703b8b5a3d5a-serving-cert\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.315021 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-client-ca\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.316589 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-proxy-ca-bundles\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.317072 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-config\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.321714 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9lft\" (UniqueName: \"kubernetes.io/projected/ae5ec419-c993-43ff-b664-703b8b5a3d5a-kube-api-access-p9lft\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.358678 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.501122 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.501253 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.593304 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" event={"ID":"aa106de9-72a4-4364-a10d-2ec2c543afcf","Type":"ContainerDied","Data":"8a78cdbee32124e2065f39d9cf54d4202c2bf87b8ec1b372bda9861fe5ee8d02"}
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.593770 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf"
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.639996 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c67ff489b-x49rf"]
Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.646258 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5c67ff489b-x49rf"]
Mar 10 00:10:54 crc kubenswrapper[4994]: I0310 00:10:54.561400 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa106de9-72a4-4364-a10d-2ec2c543afcf" path="/var/lib/kubelet/pods/aa106de9-72a4-4364-a10d-2ec2c543afcf/volumes"
Mar 10 00:11:00 crc kubenswrapper[4994]: I0310 00:11:00.506791 4994 scope.go:117] "RemoveContainer" containerID="eead1cddbea92663bf5592f78a9fd2d9a4a50429baaf76e2d49ea3e824e4b343"
Mar 10 00:11:02 crc kubenswrapper[4994]: I0310 00:11:02.499154 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:11:02 crc kubenswrapper[4994]: I0310 00:11:02.499784 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:11:03 crc kubenswrapper[4994]: E0310 00:11:03.099084 4994 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 10 00:11:03 crc kubenswrapper[4994]: E0310 00:11:03.099300 4994 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2fbrm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-s7qcn_openshift-marketplace(abe30cce-8379-4db8-838b-f48b4bc96621): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 10 00:11:03 crc kubenswrapper[4994]: E0310 00:11:03.100669 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-s7qcn" podUID="abe30cce-8379-4db8-838b-f48b4bc96621"
Mar 10 00:11:04 crc kubenswrapper[4994]: E0310 00:11:04.804909 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-s7qcn" podUID="abe30cce-8379-4db8-838b-f48b4bc96621"
Mar 10 00:11:05 crc kubenswrapper[4994]: E0310 00:11:05.011094 4994 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 10 00:11:05 crc kubenswrapper[4994]: E0310 00:11:05.011238 4994 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l95st,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bwzk5_openshift-marketplace(fdad0261-804d-41dc-8a25-48018f136c0f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 10 00:11:05 crc kubenswrapper[4994]: E0310 00:11:05.012595 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bwzk5" podUID="fdad0261-804d-41dc-8a25-48018f136c0f"
Mar 10 00:11:05 crc kubenswrapper[4994]: E0310 00:11:05.116730 4994 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 10 00:11:05 crc kubenswrapper[4994]: E0310 00:11:05.116891 4994 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xfh8s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-c4tz9_openshift-marketplace(ab6cd76f-6272-4fcd-8c75-3040c45ef1b5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 00:11:05 crc kubenswrapper[4994]: E0310 00:11:05.118413 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-c4tz9" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" Mar 10 00:11:08 crc kubenswrapper[4994]: E0310 00:11:08.946226 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bwzk5" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" Mar 10 00:11:08 crc kubenswrapper[4994]: E0310 00:11:08.946316 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-c4tz9" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" Mar 10 00:11:09 crc kubenswrapper[4994]: E0310 00:11:09.497690 4994 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 10 00:11:09 crc kubenswrapper[4994]: E0310 00:11:09.498120 4994 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-frwzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-t5kj4_openshift-marketplace(0429fae4-1356-4d61-86a3-267f74f27636): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 00:11:09 crc kubenswrapper[4994]: E0310 00:11:09.499377 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-t5kj4" podUID="0429fae4-1356-4d61-86a3-267f74f27636" Mar 10 00:11:10 crc kubenswrapper[4994]: E0310 00:11:10.532640 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-t5kj4" podUID="0429fae4-1356-4d61-86a3-267f74f27636" Mar 10 00:11:10 crc kubenswrapper[4994]: E0310 00:11:10.712933 4994 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 10 00:11:10 crc kubenswrapper[4994]: E0310 00:11:10.713176 4994 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fdkv4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-bzrd2_openshift-marketplace(64ec1b6f-2c0f-4cfc-be18-a2d311fae68c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 00:11:10 crc kubenswrapper[4994]: E0310 00:11:10.714348 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-bzrd2" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" Mar 10 00:11:11 crc 
kubenswrapper[4994]: E0310 00:11:11.632642 4994 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 10 00:11:11 crc kubenswrapper[4994]: E0310 00:11:11.633111 4994 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5lx2s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-zv2kt_openshift-marketplace(76aa065c-ed60-4237-b36f-5ce2865256ff): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 00:11:11 crc kubenswrapper[4994]: E0310 00:11:11.635034 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-zv2kt" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" Mar 10 00:11:11 crc kubenswrapper[4994]: E0310 00:11:11.736173 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-zv2kt" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" Mar 10 00:11:11 crc kubenswrapper[4994]: E0310 00:11:11.737034 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-bzrd2" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" Mar 10 00:11:11 crc kubenswrapper[4994]: I0310 00:11:11.779087 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 00:11:11 crc kubenswrapper[4994]: I0310 00:11:11.798114 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs"] Mar 10 00:11:11 crc kubenswrapper[4994]: W0310 00:11:11.802247 4994 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9ffb940_ad05_42fe_99dc_2ca36481a566.slice/crio-0ec14e3f57f4022c2cac12e0b6ad39a5f8eadbca12ac88bfd6e35f901488d9ac WatchSource:0}: Error finding container 0ec14e3f57f4022c2cac12e0b6ad39a5f8eadbca12ac88bfd6e35f901488d9ac: Status 404 returned error can't find the container with id 0ec14e3f57f4022c2cac12e0b6ad39a5f8eadbca12ac88bfd6e35f901488d9ac Mar 10 00:11:11 crc kubenswrapper[4994]: I0310 00:11:11.881061 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 00:11:11 crc kubenswrapper[4994]: I0310 00:11:11.905958 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7987b4b568-rd8h8"] Mar 10 00:11:11 crc kubenswrapper[4994]: W0310 00:11:11.910413 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3b7fc03e_d1af_479d_9315_0f25283f3aa1.slice/crio-561b9844b5f034c62f159c5807ee8bec23ed39a360cd1fa632787a672e2ff7e3 WatchSource:0}: Error finding container 561b9844b5f034c62f159c5807ee8bec23ed39a360cd1fa632787a672e2ff7e3: Status 404 returned error can't find the container with id 561b9844b5f034c62f159c5807ee8bec23ed39a360cd1fa632787a672e2ff7e3 Mar 10 00:11:11 crc kubenswrapper[4994]: W0310 00:11:11.928580 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae5ec419_c993_43ff_b664_703b8b5a3d5a.slice/crio-83f86e0cd6062d10a147cbaca0b7678a7e90a57f0aee2f43c47901ded05219f4 WatchSource:0}: Error finding container 83f86e0cd6062d10a147cbaca0b7678a7e90a57f0aee2f43c47901ded05219f4: Status 404 returned error can't find the container with id 83f86e0cd6062d10a147cbaca0b7678a7e90a57f0aee2f43c47901ded05219f4 Mar 10 00:11:12 crc kubenswrapper[4994]: E0310 00:11:12.112899 4994 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from 
manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 10 00:11:12 crc kubenswrapper[4994]: E0310 00:11:12.115181 4994 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-btt9c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-hrh9x_openshift-marketplace(a4a4dc2d-502f-4c05-ab76-1cc708f13006): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: 
context canceled" logger="UnhandledError" Mar 10 00:11:12 crc kubenswrapper[4994]: E0310 00:11:12.116563 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-hrh9x" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.499811 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.500681 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.738996 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpd8k" event={"ID":"6525b40b-1c23-4533-a025-4d86bc406f00","Type":"ContainerStarted","Data":"c11c2f647a7cbb01d788b2a61a4106505500b1d0634fb464e68d4e4b2d159f7e"} Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.740100 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3b7fc03e-d1af-479d-9315-0f25283f3aa1","Type":"ContainerStarted","Data":"561b9844b5f034c62f159c5807ee8bec23ed39a360cd1fa632787a672e2ff7e3"} Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.741956 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" 
event={"ID":"ae5ec419-c993-43ff-b664-703b8b5a3d5a","Type":"ContainerStarted","Data":"83f86e0cd6062d10a147cbaca0b7678a7e90a57f0aee2f43c47901ded05219f4"} Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.744164 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" event={"ID":"f9ffb940-ad05-42fe-99dc-2ca36481a566","Type":"ContainerStarted","Data":"3ffee0ac703d08dbee3e08fe1458ad708a9d5512f365eb71414438145464290e"} Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.744199 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" event={"ID":"f9ffb940-ad05-42fe-99dc-2ca36481a566","Type":"ContainerStarted","Data":"0ec14e3f57f4022c2cac12e0b6ad39a5f8eadbca12ac88bfd6e35f901488d9ac"} Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.745293 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e7e1505e-1226-47f8-8e43-6ac30a4ff867","Type":"ContainerStarted","Data":"929c6f36bcb8cd761948647dfebae867393436e978350669dcfa51d6e35aaf48"} Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.748705 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8lrmb" event={"ID":"4fb67636-fcba-4975-a460-403cd6ee9c25","Type":"ContainerStarted","Data":"657187cd3846e2eb040a978c60bc565d95b249e43315c01390881e18407946eb"} Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.748734 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8lrmb" Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.748804 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" 
start-of-body= Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.748831 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:11:12 crc kubenswrapper[4994]: E0310 00:11:12.748970 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-hrh9x" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.753972 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e7e1505e-1226-47f8-8e43-6ac30a4ff867","Type":"ContainerStarted","Data":"9048ce472e16494b257b9311b0a7e6443400f3bafde71adbbc641d8d3eac2afb"} Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.756434 4994 generic.go:334] "Generic (PLEG): container finished" podID="6525b40b-1c23-4533-a025-4d86bc406f00" containerID="c11c2f647a7cbb01d788b2a61a4106505500b1d0634fb464e68d4e4b2d159f7e" exitCode=0 Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.756498 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpd8k" event={"ID":"6525b40b-1c23-4533-a025-4d86bc406f00","Type":"ContainerDied","Data":"c11c2f647a7cbb01d788b2a61a4106505500b1d0634fb464e68d4e4b2d159f7e"} Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.761367 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3b7fc03e-d1af-479d-9315-0f25283f3aa1","Type":"ContainerStarted","Data":"00cdb03a355b97c57210fc1571c46882321cfc03a28cec4f9df0486d60425178"} Mar 10 
00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.763962 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" event={"ID":"ae5ec419-c993-43ff-b664-703b8b5a3d5a","Type":"ContainerStarted","Data":"436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514"} Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.764044 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.764083 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" podUID="f9ffb940-ad05-42fe-99dc-2ca36481a566" containerName="route-controller-manager" containerID="cri-o://3ffee0ac703d08dbee3e08fe1458ad708a9d5512f365eb71414438145464290e" gracePeriod=30 Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.764105 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.764131 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.764083 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.769944 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.772943 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.798931 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" podStartSLOduration=52.798906695 podStartE2EDuration="52.798906695s" podCreationTimestamp="2026-03-10 00:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:11:13.796460493 +0000 UTC m=+287.970167302" watchObservedRunningTime="2026-03-10 00:11:13.798906695 +0000 UTC m=+287.972613444" Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.853205 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" podStartSLOduration=32.853180257 podStartE2EDuration="32.853180257s" podCreationTimestamp="2026-03-10 00:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:11:13.851519985 +0000 UTC m=+288.025226744" watchObservedRunningTime="2026-03-10 00:11:13.853180257 +0000 UTC m=+288.026887026" Mar 10 00:11:14 crc kubenswrapper[4994]: I0310 00:11:14.774497 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" event={"ID":"f9ffb940-ad05-42fe-99dc-2ca36481a566","Type":"ContainerDied","Data":"3ffee0ac703d08dbee3e08fe1458ad708a9d5512f365eb71414438145464290e"} Mar 10 00:11:14 crc kubenswrapper[4994]: I0310 00:11:14.774583 4994 generic.go:334] "Generic (PLEG): container finished" 
podID="f9ffb940-ad05-42fe-99dc-2ca36481a566" containerID="3ffee0ac703d08dbee3e08fe1458ad708a9d5512f365eb71414438145464290e" exitCode=0
Mar 10 00:11:14 crc kubenswrapper[4994]: I0310 00:11:14.778600 4994 generic.go:334] "Generic (PLEG): container finished" podID="3b7fc03e-d1af-479d-9315-0f25283f3aa1" containerID="00cdb03a355b97c57210fc1571c46882321cfc03a28cec4f9df0486d60425178" exitCode=0
Mar 10 00:11:14 crc kubenswrapper[4994]: I0310 00:11:14.778765 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3b7fc03e-d1af-479d-9315-0f25283f3aa1","Type":"ContainerDied","Data":"00cdb03a355b97c57210fc1571c46882321cfc03a28cec4f9df0486d60425178"}
Mar 10 00:11:14 crc kubenswrapper[4994]: I0310 00:11:14.842847 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=34.842819648 podStartE2EDuration="34.842819648s" podCreationTimestamp="2026-03-10 00:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:11:14.83224563 +0000 UTC m=+289.005952419" watchObservedRunningTime="2026-03-10 00:11:14.842819648 +0000 UTC m=+289.016526437"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.071700 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.076131 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.105131 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"]
Mar 10 00:11:16 crc kubenswrapper[4994]: E0310 00:11:16.105518 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ffb940-ad05-42fe-99dc-2ca36481a566" containerName="route-controller-manager"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.105597 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ffb940-ad05-42fe-99dc-2ca36481a566" containerName="route-controller-manager"
Mar 10 00:11:16 crc kubenswrapper[4994]: E0310 00:11:16.105672 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b7fc03e-d1af-479d-9315-0f25283f3aa1" containerName="pruner"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.105725 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b7fc03e-d1af-479d-9315-0f25283f3aa1" containerName="pruner"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.105883 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b7fc03e-d1af-479d-9315-0f25283f3aa1" containerName="pruner"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.105954 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ffb940-ad05-42fe-99dc-2ca36481a566" containerName="route-controller-manager"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.106380 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.117985 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"]
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.154941 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-client-ca\") pod \"f9ffb940-ad05-42fe-99dc-2ca36481a566\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") "
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.155042 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r94jz\" (UniqueName: \"kubernetes.io/projected/f9ffb940-ad05-42fe-99dc-2ca36481a566-kube-api-access-r94jz\") pod \"f9ffb940-ad05-42fe-99dc-2ca36481a566\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") "
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.155110 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9ffb940-ad05-42fe-99dc-2ca36481a566-serving-cert\") pod \"f9ffb940-ad05-42fe-99dc-2ca36481a566\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") "
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.155143 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kube-api-access\") pod \"3b7fc03e-d1af-479d-9315-0f25283f3aa1\" (UID: \"3b7fc03e-d1af-479d-9315-0f25283f3aa1\") "
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.155209 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-config\") pod \"f9ffb940-ad05-42fe-99dc-2ca36481a566\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") "
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.155234 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kubelet-dir\") pod \"3b7fc03e-d1af-479d-9315-0f25283f3aa1\" (UID: \"3b7fc03e-d1af-479d-9315-0f25283f3aa1\") "
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.155525 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3b7fc03e-d1af-479d-9315-0f25283f3aa1" (UID: "3b7fc03e-d1af-479d-9315-0f25283f3aa1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.156351 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-client-ca" (OuterVolumeSpecName: "client-ca") pod "f9ffb940-ad05-42fe-99dc-2ca36481a566" (UID: "f9ffb940-ad05-42fe-99dc-2ca36481a566"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.160225 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-config" (OuterVolumeSpecName: "config") pod "f9ffb940-ad05-42fe-99dc-2ca36481a566" (UID: "f9ffb940-ad05-42fe-99dc-2ca36481a566"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.162647 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9ffb940-ad05-42fe-99dc-2ca36481a566-kube-api-access-r94jz" (OuterVolumeSpecName: "kube-api-access-r94jz") pod "f9ffb940-ad05-42fe-99dc-2ca36481a566" (UID: "f9ffb940-ad05-42fe-99dc-2ca36481a566"). InnerVolumeSpecName "kube-api-access-r94jz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.164742 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9ffb940-ad05-42fe-99dc-2ca36481a566-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f9ffb940-ad05-42fe-99dc-2ca36481a566" (UID: "f9ffb940-ad05-42fe-99dc-2ca36481a566"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.166501 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3b7fc03e-d1af-479d-9315-0f25283f3aa1" (UID: "3b7fc03e-d1af-479d-9315-0f25283f3aa1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.256091 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-client-ca\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.256141 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-config\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.256159 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3653335d-178c-4df8-a93d-4d19011298fe-serving-cert\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.256185 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8zbr\" (UniqueName: \"kubernetes.io/projected/3653335d-178c-4df8-a93d-4d19011298fe-kube-api-access-f8zbr\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.256238 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r94jz\" (UniqueName: \"kubernetes.io/projected/f9ffb940-ad05-42fe-99dc-2ca36481a566-kube-api-access-r94jz\") on node \"crc\" DevicePath \"\""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.256249 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9ffb940-ad05-42fe-99dc-2ca36481a566-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.256258 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.256268 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-config\") on node \"crc\" DevicePath \"\""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.256277 4994 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.256284 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.357940 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-client-ca\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.358004 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-config\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.358027 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3653335d-178c-4df8-a93d-4d19011298fe-serving-cert\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.358067 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8zbr\" (UniqueName: \"kubernetes.io/projected/3653335d-178c-4df8-a93d-4d19011298fe-kube-api-access-f8zbr\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.359918 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-client-ca\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.361474 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-config\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.363824 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3653335d-178c-4df8-a93d-4d19011298fe-serving-cert\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.379484 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8zbr\" (UniqueName: \"kubernetes.io/projected/3653335d-178c-4df8-a93d-4d19011298fe-kube-api-access-f8zbr\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.444288 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.794398 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3b7fc03e-d1af-479d-9315-0f25283f3aa1","Type":"ContainerDied","Data":"561b9844b5f034c62f159c5807ee8bec23ed39a360cd1fa632787a672e2ff7e3"}
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.794458 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="561b9844b5f034c62f159c5807ee8bec23ed39a360cd1fa632787a672e2ff7e3"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.794540 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.798209 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" event={"ID":"f9ffb940-ad05-42fe-99dc-2ca36481a566","Type":"ContainerDied","Data":"0ec14e3f57f4022c2cac12e0b6ad39a5f8eadbca12ac88bfd6e35f901488d9ac"}
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.798273 4994 scope.go:117] "RemoveContainer" containerID="3ffee0ac703d08dbee3e08fe1458ad708a9d5512f365eb71414438145464290e"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.798410 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.833488 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs"]
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.840740 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs"]
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.971317 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"]
Mar 10 00:11:17 crc kubenswrapper[4994]: I0310 00:11:17.808466 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg" event={"ID":"3653335d-178c-4df8-a93d-4d19011298fe","Type":"ContainerStarted","Data":"86f6391bf2290461b2367cbde0871fb0815774ff3d6099505b8889c9a6ec884a"}
Mar 10 00:11:18 crc kubenswrapper[4994]: I0310 00:11:18.574007 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9ffb940-ad05-42fe-99dc-2ca36481a566" path="/var/lib/kubelet/pods/f9ffb940-ad05-42fe-99dc-2ca36481a566/volumes"
Mar 10 00:11:18 crc kubenswrapper[4994]: I0310 00:11:18.820854 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg" event={"ID":"3653335d-178c-4df8-a93d-4d19011298fe","Type":"ContainerStarted","Data":"fb93dc31c2039db01d5ef603fb8cadf69dce66a5580028d2b9b19cc49d45e88a"}
Mar 10 00:11:18 crc kubenswrapper[4994]: I0310 00:11:18.892245 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 00:11:18 crc kubenswrapper[4994]: I0310 00:11:18.892576 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 00:11:18 crc kubenswrapper[4994]: I0310 00:11:18.892775 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kfljj"
Mar 10 00:11:18 crc kubenswrapper[4994]: I0310 00:11:18.895228 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6"} pod="openshift-machine-config-operator/machine-config-daemon-kfljj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 00:11:18 crc kubenswrapper[4994]: I0310 00:11:18.895398 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" containerID="cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6" gracePeriod=600
Mar 10 00:11:19 crc kubenswrapper[4994]: I0310 00:11:19.831357 4994 generic.go:334] "Generic (PLEG): container finished" podID="ced5d66d-39df-4267-b801-e1e60d517ace" containerID="345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6" exitCode=0
Mar 10 00:11:19 crc kubenswrapper[4994]: I0310 00:11:19.831578 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerDied","Data":"345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6"}
Mar 10 00:11:19 crc kubenswrapper[4994]: I0310 00:11:19.832434 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:19 crc kubenswrapper[4994]: I0310 00:11:19.844470 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:19 crc kubenswrapper[4994]: I0310 00:11:19.859482 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg" podStartSLOduration=38.859465542 podStartE2EDuration="38.859465542s" podCreationTimestamp="2026-03-10 00:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:11:19.855782768 +0000 UTC m=+294.029489557" watchObservedRunningTime="2026-03-10 00:11:19.859465542 +0000 UTC m=+294.033172291"
Mar 10 00:11:22 crc kubenswrapper[4994]: I0310 00:11:22.499910 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:11:22 crc kubenswrapper[4994]: I0310 00:11:22.499997 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:11:22 crc kubenswrapper[4994]: I0310 00:11:22.500394 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:11:22 crc kubenswrapper[4994]: I0310 00:11:22.500454 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:11:23 crc kubenswrapper[4994]: I0310 00:11:23.865779 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"e0758e06b050c72a5ba1ca15578add547da69884a82997478d49f051fd653d6f"}
Mar 10 00:11:24 crc kubenswrapper[4994]: I0310 00:11:24.875822 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpd8k" event={"ID":"6525b40b-1c23-4533-a025-4d86bc406f00","Type":"ContainerStarted","Data":"68b888f4cbe6324465ef819f044ee90195b5c11bf20ab3d53a6faef3352a36c6"}
Mar 10 00:11:24 crc kubenswrapper[4994]: I0310 00:11:24.919526 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wpd8k" podStartSLOduration=9.879826556 podStartE2EDuration="1m19.919505402s" podCreationTimestamp="2026-03-10 00:10:05 +0000 UTC" firstStartedPulling="2026-03-10 00:10:12.443648073 +0000 UTC m=+226.617354832" lastFinishedPulling="2026-03-10 00:11:22.483326899 +0000 UTC m=+296.657033678" observedRunningTime="2026-03-10 00:11:24.916090126 +0000 UTC m=+299.089796885" watchObservedRunningTime="2026-03-10 00:11:24.919505402 +0000 UTC m=+299.093212161"
Mar 10 00:11:25 crc kubenswrapper[4994]: I0310 00:11:25.954323 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:11:25 crc kubenswrapper[4994]: I0310 00:11:25.954412 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:11:28 crc kubenswrapper[4994]: I0310 00:11:28.828240 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wpd8k" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" containerName="registry-server" probeResult="failure" output=<
Mar 10 00:11:28 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s
Mar 10 00:11:28 crc kubenswrapper[4994]: >
Mar 10 00:11:32 crc kubenswrapper[4994]: I0310 00:11:32.499300 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:11:32 crc kubenswrapper[4994]: I0310 00:11:32.499300 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:11:32 crc kubenswrapper[4994]: I0310 00:11:32.499679 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:11:32 crc kubenswrapper[4994]: I0310 00:11:32.499710 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:11:36 crc kubenswrapper[4994]: I0310 00:11:36.949973 4994 generic.go:334] "Generic (PLEG): container finished" podID="0779a70e-ebf5-4e98-87ea-43017b8d1e46" containerID="a4cfa8b96c6aa5123624fb879c0f68820a0d96a764fb960d5b7561f433ae5dad" exitCode=0
Mar 10 00:11:36 crc kubenswrapper[4994]: I0310 00:11:36.950680 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29551680-sz8pz" event={"ID":"0779a70e-ebf5-4e98-87ea-43017b8d1e46","Type":"ContainerDied","Data":"a4cfa8b96c6aa5123624fb879c0f68820a0d96a764fb960d5b7561f433ae5dad"}
Mar 10 00:11:37 crc kubenswrapper[4994]: I0310 00:11:37.010137 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wpd8k" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" containerName="registry-server" probeResult="failure" output=<
Mar 10 00:11:37 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s
Mar 10 00:11:37 crc kubenswrapper[4994]: >
Mar 10 00:11:39 crc kubenswrapper[4994]: I0310 00:11:39.738438 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29551680-sz8pz"
Mar 10 00:11:39 crc kubenswrapper[4994]: I0310 00:11:39.828240 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0779a70e-ebf5-4e98-87ea-43017b8d1e46-serviceca\") pod \"0779a70e-ebf5-4e98-87ea-43017b8d1e46\" (UID: \"0779a70e-ebf5-4e98-87ea-43017b8d1e46\") "
Mar 10 00:11:39 crc kubenswrapper[4994]: I0310 00:11:39.828313 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zmr8\" (UniqueName: \"kubernetes.io/projected/0779a70e-ebf5-4e98-87ea-43017b8d1e46-kube-api-access-2zmr8\") pod \"0779a70e-ebf5-4e98-87ea-43017b8d1e46\" (UID: \"0779a70e-ebf5-4e98-87ea-43017b8d1e46\") "
Mar 10 00:11:39 crc kubenswrapper[4994]: I0310 00:11:39.829585 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0779a70e-ebf5-4e98-87ea-43017b8d1e46-serviceca" (OuterVolumeSpecName: "serviceca") pod "0779a70e-ebf5-4e98-87ea-43017b8d1e46" (UID: "0779a70e-ebf5-4e98-87ea-43017b8d1e46"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:11:39 crc kubenswrapper[4994]: I0310 00:11:39.835714 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0779a70e-ebf5-4e98-87ea-43017b8d1e46-kube-api-access-2zmr8" (OuterVolumeSpecName: "kube-api-access-2zmr8") pod "0779a70e-ebf5-4e98-87ea-43017b8d1e46" (UID: "0779a70e-ebf5-4e98-87ea-43017b8d1e46"). InnerVolumeSpecName "kube-api-access-2zmr8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:11:39 crc kubenswrapper[4994]: I0310 00:11:39.928901 4994 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0779a70e-ebf5-4e98-87ea-43017b8d1e46-serviceca\") on node \"crc\" DevicePath \"\""
Mar 10 00:11:39 crc kubenswrapper[4994]: I0310 00:11:39.928937 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zmr8\" (UniqueName: \"kubernetes.io/projected/0779a70e-ebf5-4e98-87ea-43017b8d1e46-kube-api-access-2zmr8\") on node \"crc\" DevicePath \"\""
Mar 10 00:11:39 crc kubenswrapper[4994]: I0310 00:11:39.966556 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29551680-sz8pz" event={"ID":"0779a70e-ebf5-4e98-87ea-43017b8d1e46","Type":"ContainerDied","Data":"b48d5517182b7a9629abeaefea2ce0d25137af9372c04a623cb57c0cb0fada84"}
Mar 10 00:11:39 crc kubenswrapper[4994]: I0310 00:11:39.966589 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b48d5517182b7a9629abeaefea2ce0d25137af9372c04a623cb57c0cb0fada84"
Mar 10 00:11:39 crc kubenswrapper[4994]: I0310 00:11:39.966639 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29551680-sz8pz"
Mar 10 00:11:42 crc kubenswrapper[4994]: I0310 00:11:42.499674 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:11:42 crc kubenswrapper[4994]: I0310 00:11:42.500025 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:11:42 crc kubenswrapper[4994]: I0310 00:11:42.500071 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-8lrmb"
Mar 10 00:11:42 crc kubenswrapper[4994]: I0310 00:11:42.500552 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:11:42 crc kubenswrapper[4994]: I0310 00:11:42.500604 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"657187cd3846e2eb040a978c60bc565d95b249e43315c01390881e18407946eb"} pod="openshift-console/downloads-7954f5f757-8lrmb" containerMessage="Container download-server failed liveness probe, will be restarted"
Mar 10 00:11:42 crc kubenswrapper[4994]: I0310 00:11:42.500634 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" containerID="cri-o://657187cd3846e2eb040a978c60bc565d95b249e43315c01390881e18407946eb" gracePeriod=2
Mar 10 00:11:42 crc kubenswrapper[4994]: I0310 00:11:42.500626 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:11:42 crc kubenswrapper[4994]: I0310 00:11:42.501711 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:11:42 crc kubenswrapper[4994]: I0310 00:11:42.501737 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:11:44 crc kubenswrapper[4994]: I0310 00:11:44.008378 4994 generic.go:334] "Generic (PLEG): container finished" podID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerID="657187cd3846e2eb040a978c60bc565d95b249e43315c01390881e18407946eb" exitCode=0
Mar 10 00:11:44 crc kubenswrapper[4994]: I0310 00:11:44.008475 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8lrmb" event={"ID":"4fb67636-fcba-4975-a460-403cd6ee9c25","Type":"ContainerDied","Data":"657187cd3846e2eb040a978c60bc565d95b249e43315c01390881e18407946eb"}
Mar 10 00:11:44 crc kubenswrapper[4994]: I0310 00:11:44.008714 4994 scope.go:117] "RemoveContainer" containerID="131f6f968be699b4510e1711ff70f7d98fd24e9b749c0ac094982fa64eb070f5"
Mar 10 00:11:46 crc kubenswrapper[4994]: I0310 00:11:46.135748 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:11:46 crc kubenswrapper[4994]: I0310 00:11:46.186475 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:11:48 crc kubenswrapper[4994]: I0310 00:11:48.037267 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7qcn" event={"ID":"abe30cce-8379-4db8-838b-f48b4bc96621","Type":"ContainerStarted","Data":"ea2fcc0ccbdd2d99bdd5e8db5934d29568b2c080e467f0b26270d4267b4ac275"}
Mar 10 00:11:48 crc kubenswrapper[4994]: I0310 00:11:48.039399 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8lrmb" event={"ID":"4fb67636-fcba-4975-a460-403cd6ee9c25","Type":"ContainerStarted","Data":"880d3d4f21f0a68a2d906d11b1bffa48bdffb339f9de23fd891a9fb4f152b67d"}
Mar 10 00:11:48 crc kubenswrapper[4994]: I0310 00:11:48.039747 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8lrmb"
Mar 10 00:11:48 crc kubenswrapper[4994]: I0310 00:11:48.040256 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:11:48 crc kubenswrapper[4994]: I0310 00:11:48.040346 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.056316 4994 generic.go:334] "Generic (PLEG): container finished" podID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerID="8a4cdb4758a8d66ac4d964d75c363e426be3ea4f0d96bd2b4370bc01dbce1a3f" exitCode=0
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.056438 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzrd2" event={"ID":"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c","Type":"ContainerDied","Data":"8a4cdb4758a8d66ac4d964d75c363e426be3ea4f0d96bd2b4370bc01dbce1a3f"}
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.059387 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5kj4" event={"ID":"0429fae4-1356-4d61-86a3-267f74f27636","Type":"ContainerStarted","Data":"979b74ea5a85ff4f5cee3a5418901bcd485b5c3af1206f8500f2ce239b83bc17"}
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.065813 4994 generic.go:334] "Generic (PLEG): container finished" podID="76aa065c-ed60-4237-b36f-5ce2865256ff" containerID="6e999aa11768c350f73b98e59c5adafa0716222aea76f3c3cc4ced602c5932bf" exitCode=0
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.065940 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zv2kt" event={"ID":"76aa065c-ed60-4237-b36f-5ce2865256ff","Type":"ContainerDied","Data":"6e999aa11768c350f73b98e59c5adafa0716222aea76f3c3cc4ced602c5932bf"}
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.070071 4994 generic.go:334] "Generic (PLEG): container finished" podID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerID="5fa6127d5cb315c05287e97611be0a26bc929ad831b3970419c02c806f804ed6" exitCode=0
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.070146 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrh9x" event={"ID":"a4a4dc2d-502f-4c05-ab76-1cc708f13006","Type":"ContainerDied","Data":"5fa6127d5cb315c05287e97611be0a26bc929ad831b3970419c02c806f804ed6"}
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.096585 4994 generic.go:334] "Generic (PLEG): container finished" podID="abe30cce-8379-4db8-838b-f48b4bc96621" containerID="ea2fcc0ccbdd2d99bdd5e8db5934d29568b2c080e467f0b26270d4267b4ac275" exitCode=0
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.096660 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7qcn" event={"ID":"abe30cce-8379-4db8-838b-f48b4bc96621","Type":"ContainerDied","Data":"ea2fcc0ccbdd2d99bdd5e8db5934d29568b2c080e467f0b26270d4267b4ac275"}
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.098744 4994 generic.go:334] "Generic (PLEG): container finished" podID="fdad0261-804d-41dc-8a25-48018f136c0f" containerID="2ed048e8f43bfa8a5d112e7ab569e89e26c34e5ab5b69b6c77b3a42aea54c386" exitCode=0
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.098822 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwzk5" event={"ID":"fdad0261-804d-41dc-8a25-48018f136c0f","Type":"ContainerDied","Data":"2ed048e8f43bfa8a5d112e7ab569e89e26c34e5ab5b69b6c77b3a42aea54c386"}
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.101558 4994 generic.go:334] "Generic (PLEG): container finished" podID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerID="757e8587473ddc7f21a46feaf304e66dbe443eeebcd4628d091bf2c8bec511d9" exitCode=0
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.102221 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4tz9" event={"ID":"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5","Type":"ContainerDied","Data":"757e8587473ddc7f21a46feaf304e66dbe443eeebcd4628d091bf2c8bec511d9"}
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.102638 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10
00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.102663 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.965284 4994 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.965745 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0779a70e-ebf5-4e98-87ea-43017b8d1e46" containerName="image-pruner" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.965772 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="0779a70e-ebf5-4e98-87ea-43017b8d1e46" containerName="image-pruner" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.966131 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="0779a70e-ebf5-4e98-87ea-43017b8d1e46" containerName="image-pruner" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.966841 4994 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.967029 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.967466 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b" gracePeriod=15 Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.967591 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c" gracePeriod=15 Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.967694 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0" gracePeriod=15 Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.967562 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04" gracePeriod=15 Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.967663 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5" gracePeriod=15 Mar 10 00:11:50 crc 
kubenswrapper[4994]: I0310 00:11:50.968486 4994 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.968729 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.968755 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.968774 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.968788 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.968803 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.968816 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.968829 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.968842 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.968855 4994 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.968867 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.968912 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.968926 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.968946 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.968960 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.968980 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.968992 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.969013 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969025 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969230 4994 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969251 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969274 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969344 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969371 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969395 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969417 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969434 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969452 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.969639 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969659 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:51 crc kubenswrapper[4994]: E0310 00:11:51.017141 4994 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.164:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: E0310 00:11:51.100692 4994 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:51 crc kubenswrapper[4994]: E0310 00:11:51.101402 4994 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:51 crc kubenswrapper[4994]: E0310 00:11:51.102158 4994 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:51 crc kubenswrapper[4994]: E0310 00:11:51.102534 4994 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:51 crc kubenswrapper[4994]: E0310 00:11:51.104800 4994 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.104868 4994 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 10 00:11:51 crc kubenswrapper[4994]: E0310 00:11:51.105540 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="200ms" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.106025 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.106083 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.106107 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.106135 4994 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.106159 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.106214 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.106274 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.106324 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc 
kubenswrapper[4994]: I0310 00:11:51.208396 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208505 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208543 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208583 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208609 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208621 4994 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208683 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208706 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208760 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208741 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208997 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.209050 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208776 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208737 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.209156 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.209316 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: E0310 00:11:51.306949 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="400ms" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.318519 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: W0310 00:11:51.359792 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-18a5e48ee962680b8c06a3e5a5396e0f805c9148a02be8cc74fe52d50407bdb6 WatchSource:0}: Error finding container 18a5e48ee962680b8c06a3e5a5396e0f805c9148a02be8cc74fe52d50407bdb6: Status 404 returned error can't find the container with id 18a5e48ee962680b8c06a3e5a5396e0f805c9148a02be8cc74fe52d50407bdb6 Mar 10 00:11:51 crc kubenswrapper[4994]: E0310 00:11:51.367203 4994 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.164:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b526e27adea94 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:11:51.36601154 +0000 UTC m=+325.539718299,LastTimestamp:2026-03-10 00:11:51.36601154 +0000 UTC m=+325.539718299,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:11:51 crc kubenswrapper[4994]: E0310 00:11:51.708455 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="800ms" Mar 10 00:11:52 crc kubenswrapper[4994]: E0310 00:11:52.109499 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:11:52Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:11:52Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:11:52Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:11:52Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2ead96dfa4a455fb7a5f837dda0a6a313f0e14d9f3c87803c59b404dd20bb8c9\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4b84875eedbc5bc73b3c7db057dd8a31dc057b3b
c3800c363d96061647d5542e\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1733466077},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:31f4b29550b003f4be97158173a6848610a5e1f2d75ec55a8f08e1290bea5743\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:8ce9cb8700109aeb73f9cdf0faff20b28b7b12065192e35305a606dfe03096f9\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1277641434},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:c58b039abb143f3d0ca40a35e68f945fc7e550a9768e6d1e501423c8b084cbe1\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:c87443452256f577ba60dbd9f77a070d264a5f4cd3924036fbc17c0665052272\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1221223632},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f
3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.
redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4e
dbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.110633 4994 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 
00:11:52.110681 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Mar 10 00:11:52 crc kubenswrapper[4994]: E0310 00:11:52.110819 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:52 crc kubenswrapper[4994]: E0310 00:11:52.111116 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:52 crc kubenswrapper[4994]: E0310 00:11:52.111583 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:52 crc kubenswrapper[4994]: E0310 00:11:52.112331 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:52 crc kubenswrapper[4994]: E0310 00:11:52.112357 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.128615 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f159c5160bbebc2d55cd9bb33ea390e800dbff7ea9620c436631139ca88c6b3b"} Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.128663 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"18a5e48ee962680b8c06a3e5a5396e0f805c9148a02be8cc74fe52d50407bdb6"} Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.130451 4994 generic.go:334] "Generic (PLEG): container finished" podID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" containerID="9048ce472e16494b257b9311b0a7e6443400f3bafde71adbbc641d8d3eac2afb" exitCode=0 Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.130529 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e7e1505e-1226-47f8-8e43-6ac30a4ff867","Type":"ContainerDied","Data":"9048ce472e16494b257b9311b0a7e6443400f3bafde71adbbc641d8d3eac2afb"} Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.131281 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.131702 4994 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.133715 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.135724 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.136707 4994 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0" exitCode=0 Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.136732 4994 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04" exitCode=0 Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.136744 4994 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c" exitCode=0 Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.136756 4994 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5" exitCode=2 Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.136794 4994 scope.go:117] "RemoveContainer" containerID="4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4" Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.499612 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.499685 4994 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.499739 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.499764 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:11:52 crc kubenswrapper[4994]: E0310 00:11:52.509473 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="1.6s" Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.437490 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.439110 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.552413 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kubelet-dir\") pod \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.552483 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-var-lock\") pod \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.552520 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e7e1505e-1226-47f8-8e43-6ac30a4ff867" (UID: "e7e1505e-1226-47f8-8e43-6ac30a4ff867"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.552559 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-var-lock" (OuterVolumeSpecName: "var-lock") pod "e7e1505e-1226-47f8-8e43-6ac30a4ff867" (UID: "e7e1505e-1226-47f8-8e43-6ac30a4ff867"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.552613 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kube-api-access\") pod \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.552825 4994 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.552841 4994 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-var-lock\") on node \"crc\" DevicePath \"\"" Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.560125 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e1505e-1226-47f8-8e43-6ac30a4ff867" (UID: "e7e1505e-1226-47f8-8e43-6ac30a4ff867"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.653571 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 00:11:54 crc kubenswrapper[4994]: E0310 00:11:54.109962 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="3.2s" Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.158679 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.160524 4994 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b" exitCode=0 Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.162642 4994 generic.go:334] "Generic (PLEG): container finished" podID="0429fae4-1356-4d61-86a3-267f74f27636" containerID="979b74ea5a85ff4f5cee3a5418901bcd485b5c3af1206f8500f2ce239b83bc17" exitCode=0 Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.162701 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5kj4" event={"ID":"0429fae4-1356-4d61-86a3-267f74f27636","Type":"ContainerDied","Data":"979b74ea5a85ff4f5cee3a5418901bcd485b5c3af1206f8500f2ce239b83bc17"} Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.164465 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.164643 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.164707 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e7e1505e-1226-47f8-8e43-6ac30a4ff867","Type":"ContainerDied","Data":"929c6f36bcb8cd761948647dfebae867393436e978350669dcfa51d6e35aaf48"} Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.164768 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="929c6f36bcb8cd761948647dfebae867393436e978350669dcfa51d6e35aaf48" Mar 10 00:11:54 crc kubenswrapper[4994]: E0310 00:11:54.165113 4994 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.164:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.169037 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.170450 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" 
Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.171728 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.192536 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.192939 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:54 crc kubenswrapper[4994]: E0310 00:11:54.651260 4994 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.164:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" volumeName="registry-storage" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.176346 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 
00:11:55.177184 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c46f93fa4d73cf6270f364f11272df6b45c18596808880a68c78370736f59ee" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.198813 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.199506 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.200027 4994 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.200208 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.200458 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.281401 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.281556 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.281610 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.281945 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.281988 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.282009 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). 
InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.383215 4994 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.383249 4994 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.383258 4994 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:11:56 crc kubenswrapper[4994]: I0310 00:11:56.185539 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:56 crc kubenswrapper[4994]: I0310 00:11:56.211850 4994 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:56 crc kubenswrapper[4994]: I0310 00:11:56.212405 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:56 crc kubenswrapper[4994]: I0310 00:11:56.212691 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:56 crc kubenswrapper[4994]: I0310 00:11:56.558780 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:56 crc kubenswrapper[4994]: I0310 00:11:56.560188 4994 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:56 crc kubenswrapper[4994]: I0310 00:11:56.560567 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:56 crc kubenswrapper[4994]: I0310 00:11:56.586852 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 10 00:11:57 crc kubenswrapper[4994]: E0310 00:11:57.310863 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="6.4s" Mar 10 00:11:57 crc kubenswrapper[4994]: E0310 
00:11:57.690848 4994 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.164:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b526e27adea94 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:11:51.36601154 +0000 UTC m=+325.539718299,LastTimestamp:2026-03-10 00:11:51.36601154 +0000 UTC m=+325.539718299,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:12:02 crc kubenswrapper[4994]: E0310 00:12:02.207434 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:12:02Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:12:02Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:12:02Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:12:02Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2ead96dfa4a455fb7a5f837dda0a6a313f0e14d9f3c87803c59b404dd20bb8c9\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4b84875eedbc5bc73b3c7db057dd8a31dc057b3bc3800c363d96061647d5542e\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1733466077},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:31f4b29550b003f4be97158173a6848610a5e1f2d75ec55a8f08e1290bea5743\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:8ce9cb8700109aeb73f9cdf0faff20b28b7b12065192e35305a606dfe03096f9\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1277641434},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"
sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:c58b039abb143f3d0ca40a35e68f945fc7e550a9768e6d1e501423c8b084cbe1\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:c87443452256f577ba60dbd9f77a070d264a5f4cd3924036fbc17c0665052272\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1221223632},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"
names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1c
e4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:02 crc kubenswrapper[4994]: E0310 00:12:02.209199 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:02 crc kubenswrapper[4994]: E0310 00:12:02.209567 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:02 crc kubenswrapper[4994]: E0310 00:12:02.209938 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:02 crc kubenswrapper[4994]: E0310 00:12:02.210257 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:02 crc kubenswrapper[4994]: E0310 00:12:02.210286 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:12:02 
crc kubenswrapper[4994]: I0310 00:12:02.499173 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:12:02 crc kubenswrapper[4994]: I0310 00:12:02.499255 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:12:02 crc kubenswrapper[4994]: I0310 00:12:02.499336 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:12:02 crc kubenswrapper[4994]: I0310 00:12:02.499446 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:12:03 crc kubenswrapper[4994]: E0310 00:12:03.711782 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="7s" Mar 10 00:12:04 crc kubenswrapper[4994]: I0310 00:12:04.553845 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:04 crc kubenswrapper[4994]: I0310 00:12:04.555678 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:04 crc kubenswrapper[4994]: I0310 00:12:04.556521 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:04 crc kubenswrapper[4994]: I0310 00:12:04.570580 4994 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:04 crc kubenswrapper[4994]: I0310 00:12:04.570788 4994 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:04 crc kubenswrapper[4994]: E0310 00:12:04.572379 4994 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:04 crc kubenswrapper[4994]: I0310 00:12:04.573331 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:05 crc kubenswrapper[4994]: I0310 00:12:05.966138 4994 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 10 00:12:05 crc kubenswrapper[4994]: I0310 00:12:05.966461 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 10 00:12:06 crc kubenswrapper[4994]: I0310 00:12:06.561644 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:06 crc kubenswrapper[4994]: I0310 00:12:06.562287 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:06 crc kubenswrapper[4994]: I0310 00:12:06.562759 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: 
connection refused" Mar 10 00:12:07 crc kubenswrapper[4994]: I0310 00:12:07.262041 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 00:12:07 crc kubenswrapper[4994]: I0310 00:12:07.262793 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 00:12:07 crc kubenswrapper[4994]: I0310 00:12:07.262863 4994 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="597b2c3115f29618ea3e6b294f965e82d2c20cb7d2696bbaf686f37aacb920d6" exitCode=1 Mar 10 00:12:07 crc kubenswrapper[4994]: I0310 00:12:07.262931 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"597b2c3115f29618ea3e6b294f965e82d2c20cb7d2696bbaf686f37aacb920d6"} Mar 10 00:12:07 crc kubenswrapper[4994]: I0310 00:12:07.263651 4994 scope.go:117] "RemoveContainer" containerID="597b2c3115f29618ea3e6b294f965e82d2c20cb7d2696bbaf686f37aacb920d6" Mar 10 00:12:07 crc kubenswrapper[4994]: I0310 00:12:07.263710 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:07 crc kubenswrapper[4994]: I0310 00:12:07.264175 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:07 crc kubenswrapper[4994]: I0310 00:12:07.264736 4994 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:07 crc kubenswrapper[4994]: I0310 00:12:07.265271 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:07 crc kubenswrapper[4994]: E0310 00:12:07.692674 4994 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.164:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b526e27adea94 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:11:51.36601154 +0000 UTC m=+325.539718299,LastTimestamp:2026-03-10 
00:11:51.36601154 +0000 UTC m=+325.539718299,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:12:09 crc kubenswrapper[4994]: I0310 00:12:09.330303 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4tz9" event={"ID":"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5","Type":"ContainerStarted","Data":"c85cb3939718589608e5c84bc5f793f0ac91554e53660eb93bb8216dc7e11be6"} Mar 10 00:12:09 crc kubenswrapper[4994]: I0310 00:12:09.333130 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:09 crc kubenswrapper[4994]: I0310 00:12:09.333638 4994 status_manager.go:851] "Failed to get status for pod" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" pod="openshift-marketplace/certified-operators-c4tz9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c4tz9\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:09 crc kubenswrapper[4994]: I0310 00:12:09.334104 4994 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:09 crc kubenswrapper[4994]: I0310 00:12:09.334605 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:09 crc kubenswrapper[4994]: I0310 00:12:09.335055 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:09 crc kubenswrapper[4994]: I0310 00:12:09.799795 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:12:10 crc kubenswrapper[4994]: E0310 00:12:10.713477 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="7s" Mar 10 00:12:12 crc kubenswrapper[4994]: E0310 00:12:12.293585 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:12:12Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:12:12Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:12:12Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:12:12Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2ead96dfa4a455fb7a5f837dda0a6a313f0e14d9f3c87803c59b404dd20bb8c9\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4b84875eedbc5bc73b3c7db057dd8a31dc057b3bc3800c363d96061647d5542e\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1733466077},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:31f4b29550b003f4be97158173a6848610a5e1f2d75ec55a8f08e1290bea5743\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:8ce9cb8700109aeb73f9cdf0faff20b28b7b12065192e35305a606dfe03096f9\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1277641434},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"
sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:c58b039abb143f3d0ca40a35e68f945fc7e550a9768e6d1e501423c8b084cbe1\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:c87443452256f577ba60dbd9f77a070d264a5f4cd3924036fbc17c0665052272\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1221223632},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"
names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1c
e4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:12 crc kubenswrapper[4994]: E0310 00:12:12.294806 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:12 crc kubenswrapper[4994]: E0310 00:12:12.295314 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:12 crc kubenswrapper[4994]: E0310 00:12:12.295735 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:12 crc kubenswrapper[4994]: E0310 00:12:12.296184 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:12 crc kubenswrapper[4994]: E0310 00:12:12.296216 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:12:12 
crc kubenswrapper[4994]: W0310 00:12:12.514771 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-7979a6518c99bce5683dee67e99778ebe73e569747a5271d1eb921be7fb9dd47 WatchSource:0}: Error finding container 7979a6518c99bce5683dee67e99778ebe73e569747a5271d1eb921be7fb9dd47: Status 404 returned error can't find the container with id 7979a6518c99bce5683dee67e99778ebe73e569747a5271d1eb921be7fb9dd47 Mar 10 00:12:12 crc kubenswrapper[4994]: I0310 00:12:12.518861 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-8lrmb" Mar 10 00:12:12 crc kubenswrapper[4994]: I0310 00:12:12.519625 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:12 crc kubenswrapper[4994]: I0310 00:12:12.520221 4994 status_manager.go:851] "Failed to get status for pod" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" pod="openshift-marketplace/certified-operators-c4tz9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c4tz9\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:12 crc kubenswrapper[4994]: I0310 00:12:12.520650 4994 status_manager.go:851] "Failed to get status for pod" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" pod="openshift-console/downloads-7954f5f757-8lrmb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/downloads-7954f5f757-8lrmb\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:12 crc kubenswrapper[4994]: I0310 00:12:12.521312 4994 status_manager.go:851] "Failed to get 
status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:12 crc kubenswrapper[4994]: I0310 00:12:12.523157 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:12 crc kubenswrapper[4994]: I0310 00:12:12.523776 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.169779 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.169842 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.239604 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.240705 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 
38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.241189 4994 status_manager.go:851] "Failed to get status for pod" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" pod="openshift-marketplace/certified-operators-c4tz9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c4tz9\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.241812 4994 status_manager.go:851] "Failed to get status for pod" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" pod="openshift-console/downloads-7954f5f757-8lrmb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/downloads-7954f5f757-8lrmb\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.242345 4994 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.242775 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.243263 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 
38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.363186 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zv2kt" event={"ID":"76aa065c-ed60-4237-b36f-5ce2865256ff","Type":"ContainerStarted","Data":"ed0179eebea68ea16a7ca2db77939a49213c1a39d735deda526e14363858855f"} Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.366024 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrh9x" event={"ID":"a4a4dc2d-502f-4c05-ab76-1cc708f13006","Type":"ContainerStarted","Data":"0872f65b4051bd51ddef4f02f90e17a29e60ddaf92da0bcd501b161644707129"} Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.367253 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.367617 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.367974 4994 status_manager.go:851] "Failed to get status for pod" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" pod="openshift-marketplace/certified-operators-c4tz9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c4tz9\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.368587 4994 status_manager.go:851] "Failed to get status for pod" 
podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" pod="openshift-marketplace/redhat-marketplace-hrh9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrh9x\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.369142 4994 status_manager.go:851] "Failed to get status for pod" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" pod="openshift-console/downloads-7954f5f757-8lrmb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/downloads-7954f5f757-8lrmb\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.369625 4994 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.369891 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.370426 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7qcn" event={"ID":"abe30cce-8379-4db8-838b-f48b4bc96621","Type":"ContainerStarted","Data":"06150f7e8e87fb17f0464494725fda3c7ecd07c48589662349d651d3700f6139"} Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.370830 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s7qcn" Mar 10 
00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.370922 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s7qcn" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.370985 4994 status_manager.go:851] "Failed to get status for pod" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" pod="openshift-console/downloads-7954f5f757-8lrmb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/downloads-7954f5f757-8lrmb\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.371333 4994 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.371606 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.371858 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.372148 4994 status_manager.go:851] "Failed to get status for pod" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" pod="openshift-marketplace/community-operators-s7qcn" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s7qcn\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.372433 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.372776 4994 status_manager.go:851] "Failed to get status for pod" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" pod="openshift-marketplace/certified-operators-c4tz9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c4tz9\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.373093 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwzk5" event={"ID":"fdad0261-804d-41dc-8a25-48018f136c0f","Type":"ContainerStarted","Data":"8050e9fca1f15bb15fdfd5da3939a33eede922d4e417035502ed98660f52b965"} Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.373262 4994 status_manager.go:851] "Failed to get status for pod" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" pod="openshift-marketplace/redhat-marketplace-hrh9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrh9x\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.397936 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.399335 4994 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.399799 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4775d9148386aa6d7bcab367446c2763501cb2fb0bab1d51b2917349e4a84821"} Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.403036 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.403995 4994 status_manager.go:851] "Failed to get status for pod" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" pod="openshift-marketplace/certified-operators-c4tz9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c4tz9\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.404529 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3e49e18983f0b96a449876dfb330dd4ce7814743360f0e56be0661919de9264d"} Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.404623 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7979a6518c99bce5683dee67e99778ebe73e569747a5271d1eb921be7fb9dd47"} Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 
00:12:13.404764 4994 status_manager.go:851] "Failed to get status for pod" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" pod="openshift-marketplace/redhat-marketplace-hrh9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrh9x\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.404999 4994 status_manager.go:851] "Failed to get status for pod" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" pod="openshift-console/downloads-7954f5f757-8lrmb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/downloads-7954f5f757-8lrmb\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.405261 4994 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.405607 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.405974 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.406278 
4994 status_manager.go:851] "Failed to get status for pod" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" pod="openshift-marketplace/community-operators-s7qcn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s7qcn\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.407323 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzrd2" event={"ID":"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c","Type":"ContainerStarted","Data":"4ac24c0285d02489c6991b060b56e4de350d7660656ffb13b6c269d046020cdf"} Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.409684 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5kj4" event={"ID":"0429fae4-1356-4d61-86a3-267f74f27636","Type":"ContainerStarted","Data":"ae53ee029113a53cb7678f130696b300909ff84d74bb704709a978c1c97a24b4"} Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.410550 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.410824 4994 status_manager.go:851] "Failed to get status for pod" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" pod="openshift-marketplace/community-operators-s7qcn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s7qcn\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.413989 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.414221 4994 status_manager.go:851] "Failed to get status for pod" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" pod="openshift-marketplace/certified-operators-c4tz9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c4tz9\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.414389 4994 status_manager.go:851] "Failed to get status for pod" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" pod="openshift-marketplace/redhat-marketplace-hrh9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrh9x\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.414555 4994 status_manager.go:851] "Failed to get status for pod" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" pod="openshift-console/downloads-7954f5f757-8lrmb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/downloads-7954f5f757-8lrmb\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.414728 4994 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.414923 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.459606 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.460326 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.460723 4994 status_manager.go:851] "Failed to get status for pod" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" pod="openshift-marketplace/certified-operators-c4tz9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c4tz9\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.461345 4994 status_manager.go:851] "Failed to get status for pod" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" pod="openshift-marketplace/redhat-marketplace-hrh9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrh9x\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.461703 4994 status_manager.go:851] "Failed to get status for pod" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" pod="openshift-console/downloads-7954f5f757-8lrmb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/downloads-7954f5f757-8lrmb\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.461988 4994 
status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.462271 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.462543 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.462845 4994 status_manager.go:851] "Failed to get status for pod" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" pod="openshift-marketplace/community-operators-s7qcn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s7qcn\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.785685 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.417549 4994 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="3e49e18983f0b96a449876dfb330dd4ce7814743360f0e56be0661919de9264d" exitCode=0 Mar 10 00:12:14 crc 
kubenswrapper[4994]: I0310 00:12:14.417592 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"3e49e18983f0b96a449876dfb330dd4ce7814743360f0e56be0661919de9264d"} Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.417853 4994 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.418287 4994 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.418158 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: E0310 00:12:14.418752 4994 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.418762 4994 status_manager.go:851] "Failed to get status for pod" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" pod="openshift-marketplace/certified-operators-c4tz9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c4tz9\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.419152 4994 status_manager.go:851] "Failed to get status for pod" 
podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" pod="openshift-marketplace/redhat-marketplace-hrh9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrh9x\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.419365 4994 status_manager.go:851] "Failed to get status for pod" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" pod="openshift-console/downloads-7954f5f757-8lrmb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/downloads-7954f5f757-8lrmb\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.419618 4994 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.420180 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.420534 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.421054 4994 status_manager.go:851] "Failed to get status for pod" 
podUID="abe30cce-8379-4db8-838b-f48b4bc96621" pod="openshift-marketplace/community-operators-s7qcn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s7qcn\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.421442 4994 status_manager.go:851] "Failed to get status for pod" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" pod="openshift-marketplace/community-operators-zv2kt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zv2kt\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.421707 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.422068 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-s7qcn" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" containerName="registry-server" probeResult="failure" output=< Mar 10 00:12:14 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s Mar 10 00:12:14 crc kubenswrapper[4994]: > Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.422100 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.422611 4994 status_manager.go:851] "Failed to get status for pod" 
podUID="abe30cce-8379-4db8-838b-f48b4bc96621" pod="openshift-marketplace/community-operators-s7qcn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s7qcn\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.422914 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.423188 4994 status_manager.go:851] "Failed to get status for pod" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" pod="openshift-marketplace/certified-operators-c4tz9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c4tz9\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.423460 4994 status_manager.go:851] "Failed to get status for pod" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" pod="openshift-marketplace/redhat-marketplace-hrh9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrh9x\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.423775 4994 status_manager.go:851] "Failed to get status for pod" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" pod="openshift-marketplace/certified-operators-bwzk5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bwzk5\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.424123 4994 status_manager.go:851] "Failed to get status for pod" 
podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" pod="openshift-console/downloads-7954f5f757-8lrmb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/downloads-7954f5f757-8lrmb\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.424398 4994 status_manager.go:851] "Failed to get status for pod" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" pod="openshift-marketplace/redhat-marketplace-bzrd2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bzrd2\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.424725 4994 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:15 crc kubenswrapper[4994]: I0310 00:12:15.007486 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:12:15 crc kubenswrapper[4994]: I0310 00:12:15.007911 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:12:15 crc kubenswrapper[4994]: I0310 00:12:15.381359 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:12:15 crc kubenswrapper[4994]: I0310 00:12:15.381467 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:12:16 crc kubenswrapper[4994]: I0310 00:12:16.047167 4994 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-marketplace-hrh9x" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerName="registry-server" probeResult="failure" output=< Mar 10 00:12:16 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s Mar 10 00:12:16 crc kubenswrapper[4994]: > Mar 10 00:12:16 crc kubenswrapper[4994]: I0310 00:12:16.356206 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t5kj4" Mar 10 00:12:16 crc kubenswrapper[4994]: I0310 00:12:16.356322 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t5kj4" Mar 10 00:12:16 crc kubenswrapper[4994]: I0310 00:12:16.448818 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-bzrd2" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerName="registry-server" probeResult="failure" output=< Mar 10 00:12:16 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s Mar 10 00:12:16 crc kubenswrapper[4994]: > Mar 10 00:12:16 crc kubenswrapper[4994]: I0310 00:12:16.454749 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5aeba6f8d1fc3d56c8e40a470bacd9595ee0ce5128539e66674f577f8cf699d5"} Mar 10 00:12:17 crc kubenswrapper[4994]: I0310 00:12:17.422354 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t5kj4" podUID="0429fae4-1356-4d61-86a3-267f74f27636" containerName="registry-server" probeResult="failure" output=< Mar 10 00:12:17 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s Mar 10 00:12:17 crc kubenswrapper[4994]: > Mar 10 00:12:18 crc kubenswrapper[4994]: I0310 00:12:18.470734 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a5c7ed696570aa0f3b3a821017b75c4e650df498798189c2c3a5989dfb0673a1"} Mar 10 00:12:19 crc kubenswrapper[4994]: I0310 00:12:19.799449 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:12:19 crc kubenswrapper[4994]: I0310 00:12:19.804915 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:12:20 crc kubenswrapper[4994]: I0310 00:12:20.491055 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d8ac9c96642120235e24de181f59530c330fdc1abb013c500614df8f311a6af1"} Mar 10 00:12:22 crc kubenswrapper[4994]: I0310 00:12:22.511504 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7681f4b847501f8824d526a1f4e3d9f91788f9203b77c819a471dc9828d9c67f"} Mar 10 00:12:22 crc kubenswrapper[4994]: I0310 00:12:22.758618 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:12:22 crc kubenswrapper[4994]: I0310 00:12:22.758718 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:12:22 crc kubenswrapper[4994]: I0310 00:12:22.836472 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:12:22 crc kubenswrapper[4994]: I0310 00:12:22.999473 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:12:22 crc kubenswrapper[4994]: 
I0310 00:12:22.999563 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.086727 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.423599 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s7qcn" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.492267 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s7qcn" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.525570 4994 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.525613 4994 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.525903 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a276672f8df37350ac03bef30c33aaff113a566814c0b2b9e9da731500296641"} Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.527511 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.535774 4994 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.544776 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aeba6f8d1fc3d56c8e40a470bacd9595ee0ce5128539e66674f577f8cf699d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8ac9c96642120235e24de181f59530c330fdc1abb013c500614df8f311a6af1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c7ed696570aa0f3b3a821017b75c4e650df498798189c2c3a5989dfb0673a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:12:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a276672f8df37350ac03bef30c33aaff113a566814c0b2b9e9da731500296641\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:12:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://7681f4b847501f8824d526a1f4e3d9f91788f9203b77c819a471dc9828d9c67f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:12:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": pods \"kube-apiserver-crc\" not found" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.587783 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.595167 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.611774 4994 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e132a8ed-7c07-4ab3-8467-4370b175763c" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.793357 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:12:24 crc kubenswrapper[4994]: I0310 00:12:24.542850 4994 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:24 crc kubenswrapper[4994]: I0310 00:12:24.543223 4994 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:24 crc kubenswrapper[4994]: I0310 00:12:24.573454 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:24 crc kubenswrapper[4994]: I0310 00:12:24.573521 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:24 crc kubenswrapper[4994]: I0310 00:12:24.580787 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:25 crc kubenswrapper[4994]: I0310 00:12:25.075083 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:12:25 crc kubenswrapper[4994]: I0310 00:12:25.147772 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:12:25 crc kubenswrapper[4994]: I0310 00:12:25.450503 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:12:25 crc kubenswrapper[4994]: I0310 00:12:25.525497 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:12:25 crc kubenswrapper[4994]: I0310 00:12:25.551411 4994 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:25 crc kubenswrapper[4994]: I0310 00:12:25.551455 4994 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:25 crc kubenswrapper[4994]: I0310 00:12:25.562528 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:25 crc kubenswrapper[4994]: I0310 00:12:25.567388 4994 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e132a8ed-7c07-4ab3-8467-4370b175763c" Mar 10 00:12:26 crc kubenswrapper[4994]: I0310 00:12:26.422244 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t5kj4" Mar 10 00:12:26 crc kubenswrapper[4994]: I0310 00:12:26.490043 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t5kj4" Mar 10 00:12:26 crc kubenswrapper[4994]: I0310 00:12:26.567099 4994 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:26 crc kubenswrapper[4994]: I0310 00:12:26.567133 4994 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:27 crc kubenswrapper[4994]: I0310 00:12:27.568340 4994 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:27 crc kubenswrapper[4994]: I0310 00:12:27.568381 4994 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:34 crc kubenswrapper[4994]: I0310 00:12:34.583972 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:34 crc kubenswrapper[4994]: I0310 00:12:34.585398 4994 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:34 crc kubenswrapper[4994]: I0310 
00:12:34.585432 4994 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:36 crc kubenswrapper[4994]: I0310 00:12:36.600315 4994 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e132a8ed-7c07-4ab3-8467-4370b175763c" Mar 10 00:12:50 crc kubenswrapper[4994]: I0310 00:12:50.523901 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 10 00:12:51 crc kubenswrapper[4994]: I0310 00:12:51.243363 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 10 00:12:52 crc kubenswrapper[4994]: I0310 00:12:52.230657 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 10 00:12:52 crc kubenswrapper[4994]: I0310 00:12:52.269775 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 00:12:52 crc kubenswrapper[4994]: I0310 00:12:52.611463 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 10 00:12:53 crc kubenswrapper[4994]: I0310 00:12:53.824900 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 10 00:12:53 crc kubenswrapper[4994]: I0310 00:12:53.922173 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 10 00:12:54 crc kubenswrapper[4994]: I0310 00:12:54.066629 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 10 
00:12:54 crc kubenswrapper[4994]: I0310 00:12:54.647824 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 00:12:54 crc kubenswrapper[4994]: I0310 00:12:54.758540 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 10 00:12:54 crc kubenswrapper[4994]: I0310 00:12:54.839584 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 10 00:12:55 crc kubenswrapper[4994]: I0310 00:12:55.122730 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 10 00:12:55 crc kubenswrapper[4994]: I0310 00:12:55.529442 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 10 00:12:55 crc kubenswrapper[4994]: I0310 00:12:55.581338 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 10 00:12:55 crc kubenswrapper[4994]: I0310 00:12:55.654593 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 10 00:12:56 crc kubenswrapper[4994]: I0310 00:12:56.024750 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 10 00:12:56 crc kubenswrapper[4994]: I0310 00:12:56.525347 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 10 00:12:56 crc kubenswrapper[4994]: I0310 00:12:56.633186 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 00:12:56 crc kubenswrapper[4994]: I0310 00:12:56.698488 4994 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 10 00:12:56 crc kubenswrapper[4994]: I0310 00:12:56.785272 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 10 00:12:56 crc kubenswrapper[4994]: I0310 00:12:56.820447 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 00:12:57.138482 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 00:12:57.231972 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 00:12:57.337524 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 00:12:57.372518 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 00:12:57.383534 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 00:12:57.688425 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 00:12:57.700832 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 00:12:57.800928 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 
00:12:57.838382 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 00:12:57.838719 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 00:12:57.913495 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 10 00:12:58 crc kubenswrapper[4994]: I0310 00:12:58.031531 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 10 00:12:58 crc kubenswrapper[4994]: I0310 00:12:58.311062 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 10 00:12:58 crc kubenswrapper[4994]: I0310 00:12:58.312369 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 10 00:12:58 crc kubenswrapper[4994]: I0310 00:12:58.383008 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 10 00:12:58 crc kubenswrapper[4994]: I0310 00:12:58.770630 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 10 00:12:58 crc kubenswrapper[4994]: I0310 00:12:58.835420 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 00:12:58 crc kubenswrapper[4994]: I0310 00:12:58.840919 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 10 00:12:58 crc kubenswrapper[4994]: I0310 00:12:58.920043 4994 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"kube-root-ca.crt" Mar 10 00:12:59 crc kubenswrapper[4994]: I0310 00:12:59.279974 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 10 00:12:59 crc kubenswrapper[4994]: I0310 00:12:59.448837 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 10 00:12:59 crc kubenswrapper[4994]: I0310 00:12:59.607166 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 00:12:59 crc kubenswrapper[4994]: I0310 00:12:59.779028 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 10 00:12:59 crc kubenswrapper[4994]: I0310 00:12:59.825060 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.073992 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.139740 4994 generic.go:334] "Generic (PLEG): container finished" podID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerID="8746dfac88fe15ceee6b052e166cd28bfb74ffbe54cfe6f00bf09d8a12e889fd" exitCode=0 Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.139809 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" event={"ID":"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd","Type":"ContainerDied","Data":"8746dfac88fe15ceee6b052e166cd28bfb74ffbe54cfe6f00bf09d8a12e889fd"} Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.140517 4994 scope.go:117] "RemoveContainer" containerID="8746dfac88fe15ceee6b052e166cd28bfb74ffbe54cfe6f00bf09d8a12e889fd" Mar 10 
00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.218072 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.370050 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.461102 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.521280 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.559166 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.649999 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.756778 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.844505 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.012727 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.088494 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.156169 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-tgf68_b85bbdaa-daa8-4c69-abf9-9f1200eb07cd/marketplace-operator/1.log"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.156895 4994 generic.go:334] "Generic (PLEG): container finished" podID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerID="7bce6eb584bb3c8177c72c93b8bc60cfd8583faf52e9d74e3b32955e8a8583bd" exitCode=1
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.156942 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" event={"ID":"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd","Type":"ContainerDied","Data":"7bce6eb584bb3c8177c72c93b8bc60cfd8583faf52e9d74e3b32955e8a8583bd"}
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.157234 4994 scope.go:117] "RemoveContainer" containerID="8746dfac88fe15ceee6b052e166cd28bfb74ffbe54cfe6f00bf09d8a12e889fd"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.157978 4994 scope.go:117] "RemoveContainer" containerID="7bce6eb584bb3c8177c72c93b8bc60cfd8583faf52e9d74e3b32955e8a8583bd"
Mar 10 00:13:01 crc kubenswrapper[4994]: E0310 00:13:01.158471 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-tgf68_openshift-marketplace(b85bbdaa-daa8-4c69-abf9-9f1200eb07cd)\"" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.186199 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.222995 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.224746 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.259190 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.292403 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.517622 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.536348 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.926314 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.947741 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.966231 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 10 00:13:02 crc kubenswrapper[4994]: I0310 00:13:02.061025 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68"
Mar 10 00:13:02 crc kubenswrapper[4994]: I0310 00:13:02.061095 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68"
Mar 10 00:13:02 crc kubenswrapper[4994]: I0310 00:13:02.169371 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-tgf68_b85bbdaa-daa8-4c69-abf9-9f1200eb07cd/marketplace-operator/1.log"
Mar 10 00:13:02 crc kubenswrapper[4994]: I0310 00:13:02.170103 4994 scope.go:117] "RemoveContainer" containerID="7bce6eb584bb3c8177c72c93b8bc60cfd8583faf52e9d74e3b32955e8a8583bd"
Mar 10 00:13:02 crc kubenswrapper[4994]: E0310 00:13:02.170408 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-tgf68_openshift-marketplace(b85bbdaa-daa8-4c69-abf9-9f1200eb07cd)\"" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd"
Mar 10 00:13:02 crc kubenswrapper[4994]: I0310 00:13:02.283312 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 10 00:13:02 crc kubenswrapper[4994]: I0310 00:13:02.322807 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 10 00:13:02 crc kubenswrapper[4994]: I0310 00:13:02.483220 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 10 00:13:02 crc kubenswrapper[4994]: I0310 00:13:02.537458 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 10 00:13:02 crc kubenswrapper[4994]: I0310 00:13:02.817565 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 10 00:13:02 crc kubenswrapper[4994]: I0310 00:13:02.974867 4994 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.122193 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.129992 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.173700 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.469023 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.495866 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.621363 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.677069 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.758705 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.811183 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.868307 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.902269 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.927663 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.957199 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.076527 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.101746 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.147917 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.165047 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.259652 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.265560 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.401834 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.447342 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.472362 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.485072 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.555656 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.595488 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.743915 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.921386 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 10 00:13:05 crc kubenswrapper[4994]: I0310 00:13:05.425468 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 10 00:13:05 crc kubenswrapper[4994]: I0310 00:13:05.440019 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 10 00:13:05 crc kubenswrapper[4994]: I0310 00:13:05.803944 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 10 00:13:05 crc kubenswrapper[4994]: I0310 00:13:05.836784 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 10 00:13:05 crc kubenswrapper[4994]: I0310 00:13:05.890751 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.061064 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.158475 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.219922 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.263203 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.289397 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.335867 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.380653 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.546335 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.569657 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.611205 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.765408 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.793951 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.916116 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 10 00:13:07 crc kubenswrapper[4994]: I0310 00:13:07.151931 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 10 00:13:07 crc kubenswrapper[4994]: I0310 00:13:07.453656 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 10 00:13:07 crc kubenswrapper[4994]: I0310 00:13:07.485207 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 10 00:13:07 crc kubenswrapper[4994]: I0310 00:13:07.528042 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 10 00:13:07 crc kubenswrapper[4994]: I0310 00:13:07.567856 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 10 00:13:07 crc kubenswrapper[4994]: I0310 00:13:07.622600 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 10 00:13:07 crc kubenswrapper[4994]: I0310 00:13:07.669567 4994 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 10 00:13:07 crc kubenswrapper[4994]: I0310 00:13:07.691711 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 10 00:13:07 crc kubenswrapper[4994]: I0310 00:13:07.886385 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 10 00:13:08 crc kubenswrapper[4994]: I0310 00:13:08.138511 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 10 00:13:08 crc kubenswrapper[4994]: I0310 00:13:08.155823 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 10 00:13:08 crc kubenswrapper[4994]: I0310 00:13:08.209592 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 10 00:13:08 crc kubenswrapper[4994]: I0310 00:13:08.377219 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 10 00:13:08 crc kubenswrapper[4994]: I0310 00:13:08.489571 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 10 00:13:08 crc kubenswrapper[4994]: I0310 00:13:08.604903 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 10 00:13:08 crc kubenswrapper[4994]: I0310 00:13:08.692032 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 10 00:13:08 crc kubenswrapper[4994]: I0310 00:13:08.867049 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.049977 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.116748 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.188307 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.257194 4994 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.321531 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.332985 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.383306 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.554935 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.612648 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.622702 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.749565 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.773971 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.818133 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.029074 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.074567 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.125502 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.197005 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.220300 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.224533 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.263509 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.324983 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.325274 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.453619 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.466249 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.473405 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.585654 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.891192 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.970558 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.198320 4994 scope.go:117] "RemoveContainer" containerID="ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.222530 4994 scope.go:117] "RemoveContainer" containerID="5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.246985 4994 scope.go:117] "RemoveContainer" containerID="71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.267330 4994 scope.go:117] "RemoveContainer" containerID="ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.289004 4994 scope.go:117] "RemoveContainer" containerID="b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.301552 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.389975 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.507719 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.702245 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.716352 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.723186 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.793656 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.836468 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.941234 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.014231 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.194212 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.273398 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.276241 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.364372 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.452480 4994 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.478328 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.790141 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.816516 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.816586 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.926829 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 10 00:13:13 crc kubenswrapper[4994]: I0310 00:13:13.045229 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 10 00:13:13 crc kubenswrapper[4994]: I0310 00:13:13.068744 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 10 00:13:13 crc kubenswrapper[4994]: I0310 00:13:13.307711 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 10 00:13:13 crc kubenswrapper[4994]: I0310 00:13:13.373497 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 10 00:13:13 crc kubenswrapper[4994]: I0310 00:13:13.392038 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 10 00:13:13 crc kubenswrapper[4994]: I0310 00:13:13.396360 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 10 00:13:13 crc kubenswrapper[4994]: I0310 00:13:13.693933 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.034949 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.059345 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.379421 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.407784 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.554290 4994 scope.go:117] "RemoveContainer" containerID="7bce6eb584bb3c8177c72c93b8bc60cfd8583faf52e9d74e3b32955e8a8583bd"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.585606 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.661101 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.727226 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.727293 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.736117 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.892685 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.928035 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.011361 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.155123 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.270418 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-tgf68_b85bbdaa-daa8-4c69-abf9-9f1200eb07cd/marketplace-operator/2.log"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.271224 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-tgf68_b85bbdaa-daa8-4c69-abf9-9f1200eb07cd/marketplace-operator/1.log"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.271289 4994 generic.go:334] "Generic (PLEG): container finished" podID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerID="e7fca198849ed5918e32d90439680c268c9d2e25eff70cc1e2ce92503bc67d85" exitCode=1
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.271328 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" event={"ID":"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd","Type":"ContainerDied","Data":"e7fca198849ed5918e32d90439680c268c9d2e25eff70cc1e2ce92503bc67d85"}
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.271383 4994 scope.go:117] "RemoveContainer" containerID="7bce6eb584bb3c8177c72c93b8bc60cfd8583faf52e9d74e3b32955e8a8583bd"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.273410 4994 scope.go:117] "RemoveContainer" containerID="e7fca198849ed5918e32d90439680c268c9d2e25eff70cc1e2ce92503bc67d85"
Mar 10 00:13:15 crc kubenswrapper[4994]: E0310 00:13:15.274419 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-tgf68_openshift-marketplace(b85bbdaa-daa8-4c69-abf9-9f1200eb07cd)\"" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.404816 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.482335 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.489241 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.540134 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.735362 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.778612 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.871540 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 10 00:13:16 crc kubenswrapper[4994]: I0310 00:13:16.281611 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-tgf68_b85bbdaa-daa8-4c69-abf9-9f1200eb07cd/marketplace-operator/2.log"
Mar 10 00:13:16 crc kubenswrapper[4994]: I0310 00:13:16.347811 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 10 00:13:16 crc kubenswrapper[4994]: I0310 00:13:16.392970 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 10 00:13:16 crc kubenswrapper[4994]: I0310 00:13:16.531987 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 10 00:13:16 crc kubenswrapper[4994]: I0310 00:13:16.627511 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 10 00:13:16 crc kubenswrapper[4994]: I0310 00:13:16.660250 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 10 00:13:16 crc kubenswrapper[4994]: I0310 00:13:16.748926 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.102178 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.160809 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.195214 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.399830 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.525559 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.528889 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.690603 4994 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.691388 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c4tz9" podStartSLOduration=87.662213091 podStartE2EDuration="3m15.691362194s" podCreationTimestamp="2026-03-10 00:10:02 +0000 UTC" firstStartedPulling="2026-03-10 00:10:12.428679297 +0000 UTC m=+226.602386046" lastFinishedPulling="2026-03-10 00:12:00.45782837 +0000 UTC m=+334.631535149" observedRunningTime="2026-03-10 00:12:23.201989372 +0000 UTC m=+357.375696131" watchObservedRunningTime="2026-03-10 00:13:17.691362194 +0000 UTC m=+411.865068983"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.691557 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bzrd2" podStartSLOduration=73.967708237 podStartE2EDuration="3m12.691549919s" podCreationTimestamp="2026-03-10 00:10:05 +0000 UTC" firstStartedPulling="2026-03-10 00:10:12.461758126 +0000 UTC m=+226.635464875" lastFinishedPulling="2026-03-10 00:12:11.185599768 +0000 UTC m=+345.359306557" observedRunningTime="2026-03-10 00:12:23.257524618 +0000 UTC m=+357.431231377" watchObservedRunningTime="2026-03-10 00:13:17.691549919 +0000 UTC m=+411.865256698"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.693788 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s7qcn" podStartSLOduration=86.716270785 podStartE2EDuration="3m14.693775054s" podCreationTimestamp="2026-03-10 00:10:03 +0000 UTC" firstStartedPulling="2026-03-10 00:10:12.479856419 +0000 UTC m=+226.653563168" lastFinishedPulling="2026-03-10 00:12:00.457360648 +0000 UTC m=+334.631067437" observedRunningTime="2026-03-10 00:12:23.12237734 +0000 UTC m=+357.296084159" watchObservedRunningTime="2026-03-10 00:13:17.693775054 +0000 UTC m=+411.867481843"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.694076 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zv2kt" podStartSLOduration=84.258518061 podStartE2EDuration="3m15.694067971s" podCreationTimestamp="2026-03-10 00:10:02 +0000 UTC" firstStartedPulling="2026-03-10 00:10:12.477159162 +0000 UTC m=+226.650865911" lastFinishedPulling="2026-03-10 00:12:03.912709072 +0000 UTC m=+338.086415821" observedRunningTime="2026-03-10 00:12:23.290094353 +0000 UTC m=+357.463801102" watchObservedRunningTime="2026-03-10 00:13:17.694067971 +0000 UTC m=+411.867774770"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.696137 4994 pod_startup_latency_tracker.go:104] "Observed
pod startup duration" pod="openshift-marketplace/redhat-operators-t5kj4" podStartSLOduration=71.667067255 podStartE2EDuration="3m11.696126513s" podCreationTimestamp="2026-03-10 00:10:06 +0000 UTC" firstStartedPulling="2026-03-10 00:10:12.468096984 +0000 UTC m=+226.641803743" lastFinishedPulling="2026-03-10 00:12:12.497156242 +0000 UTC m=+346.670863001" observedRunningTime="2026-03-10 00:12:23.144680031 +0000 UTC m=+357.318386780" watchObservedRunningTime="2026-03-10 00:13:17.696126513 +0000 UTC m=+411.869833292" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.696410 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bwzk5" podStartSLOduration=77.00224967 podStartE2EDuration="3m15.69640182s" podCreationTimestamp="2026-03-10 00:10:02 +0000 UTC" firstStartedPulling="2026-03-10 00:10:12.492788593 +0000 UTC m=+226.666495342" lastFinishedPulling="2026-03-10 00:12:11.186940703 +0000 UTC m=+345.360647492" observedRunningTime="2026-03-10 00:12:23.170935795 +0000 UTC m=+357.344642554" watchObservedRunningTime="2026-03-10 00:13:17.69640182 +0000 UTC m=+411.870108609" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.697143 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hrh9x" podStartSLOduration=83.340264518 podStartE2EDuration="3m13.697134298s" podCreationTimestamp="2026-03-10 00:10:04 +0000 UTC" firstStartedPulling="2026-03-10 00:10:12.436264708 +0000 UTC m=+226.609971447" lastFinishedPulling="2026-03-10 00:12:02.793134428 +0000 UTC m=+336.966841227" observedRunningTime="2026-03-10 00:12:23.230209937 +0000 UTC m=+357.403916696" watchObservedRunningTime="2026-03-10 00:13:17.697134298 +0000 UTC m=+411.870841077" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.699962 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 
00:13:17.700038 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-infra/auto-csr-approver-29551692-29hls"] Mar 10 00:13:17 crc kubenswrapper[4994]: E0310 00:13:17.700338 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" containerName="installer" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.700357 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" containerName="installer" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.700546 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" containerName="installer" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.701218 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551692-29hls" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.703915 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.704341 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.704637 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.709747 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.725569 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnm5w\" (UniqueName: 
\"kubernetes.io/projected/24a70a0f-0e78-4f55-9eee-62099acf734d-kube-api-access-dnm5w\") pod \"auto-csr-approver-29551692-29hls\" (UID: \"24a70a0f-0e78-4f55-9eee-62099acf734d\") " pod="openshift-infra/auto-csr-approver-29551692-29hls" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.730673 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=54.73064553 podStartE2EDuration="54.73064553s" podCreationTimestamp="2026-03-10 00:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:13:17.727479461 +0000 UTC m=+411.901186240" watchObservedRunningTime="2026-03-10 00:13:17.73064553 +0000 UTC m=+411.904352279" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.748490 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=22.748465932 podStartE2EDuration="22.748465932s" podCreationTimestamp="2026-03-10 00:12:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:13:17.74840647 +0000 UTC m=+411.922113229" watchObservedRunningTime="2026-03-10 00:13:17.748465932 +0000 UTC m=+411.922172701" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.829552 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnm5w\" (UniqueName: \"kubernetes.io/projected/24a70a0f-0e78-4f55-9eee-62099acf734d-kube-api-access-dnm5w\") pod \"auto-csr-approver-29551692-29hls\" (UID: \"24a70a0f-0e78-4f55-9eee-62099acf734d\") " pod="openshift-infra/auto-csr-approver-29551692-29hls" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.859986 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnm5w\" (UniqueName: 
\"kubernetes.io/projected/24a70a0f-0e78-4f55-9eee-62099acf734d-kube-api-access-dnm5w\") pod \"auto-csr-approver-29551692-29hls\" (UID: \"24a70a0f-0e78-4f55-9eee-62099acf734d\") " pod="openshift-infra/auto-csr-approver-29551692-29hls" Mar 10 00:13:18 crc kubenswrapper[4994]: I0310 00:13:18.024678 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551692-29hls" Mar 10 00:13:18 crc kubenswrapper[4994]: I0310 00:13:18.064103 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 10 00:13:18 crc kubenswrapper[4994]: I0310 00:13:18.211140 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 10 00:13:18 crc kubenswrapper[4994]: I0310 00:13:18.244414 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 10 00:13:18 crc kubenswrapper[4994]: I0310 00:13:18.456472 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551692-29hls"] Mar 10 00:13:18 crc kubenswrapper[4994]: W0310 00:13:18.468152 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24a70a0f_0e78_4f55_9eee_62099acf734d.slice/crio-87f833a4ec9d30c9292ef3b7519a0b401b021a95f8b7dbbbbcf037633a46166b WatchSource:0}: Error finding container 87f833a4ec9d30c9292ef3b7519a0b401b021a95f8b7dbbbbcf037633a46166b: Status 404 returned error can't find the container with id 87f833a4ec9d30c9292ef3b7519a0b401b021a95f8b7dbbbbcf037633a46166b Mar 10 00:13:19 crc kubenswrapper[4994]: I0310 00:13:19.260147 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 10 00:13:19 crc kubenswrapper[4994]: I0310 00:13:19.303154 4994 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-infra/auto-csr-approver-29551692-29hls" event={"ID":"24a70a0f-0e78-4f55-9eee-62099acf734d","Type":"ContainerStarted","Data":"87f833a4ec9d30c9292ef3b7519a0b401b021a95f8b7dbbbbcf037633a46166b"} Mar 10 00:13:19 crc kubenswrapper[4994]: I0310 00:13:19.335761 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 10 00:13:19 crc kubenswrapper[4994]: I0310 00:13:19.660932 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 00:13:19 crc kubenswrapper[4994]: I0310 00:13:19.759778 4994 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 00:13:19 crc kubenswrapper[4994]: I0310 00:13:19.760169 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f159c5160bbebc2d55cd9bb33ea390e800dbff7ea9620c436631139ca88c6b3b" gracePeriod=5 Mar 10 00:13:19 crc kubenswrapper[4994]: I0310 00:13:19.839637 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 10 00:13:20 crc kubenswrapper[4994]: I0310 00:13:20.082434 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 10 00:13:20 crc kubenswrapper[4994]: I0310 00:13:20.269450 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 10 00:13:20 crc kubenswrapper[4994]: I0310 00:13:20.313926 4994 generic.go:334] "Generic (PLEG): container finished" podID="24a70a0f-0e78-4f55-9eee-62099acf734d" containerID="ef1f80910f9e65f34790675bdb343fd6ffef0cbe9f22353df047c95ba63843ec" exitCode=0 Mar 10 00:13:20 
crc kubenswrapper[4994]: I0310 00:13:20.314020 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551692-29hls" event={"ID":"24a70a0f-0e78-4f55-9eee-62099acf734d","Type":"ContainerDied","Data":"ef1f80910f9e65f34790675bdb343fd6ffef0cbe9f22353df047c95ba63843ec"} Mar 10 00:13:20 crc kubenswrapper[4994]: I0310 00:13:20.401099 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 10 00:13:20 crc kubenswrapper[4994]: I0310 00:13:20.854682 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.025544 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.077348 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.293460 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.425428 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.478182 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.513489 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.617055 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551692-29hls" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.638429 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.684750 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnm5w\" (UniqueName: \"kubernetes.io/projected/24a70a0f-0e78-4f55-9eee-62099acf734d-kube-api-access-dnm5w\") pod \"24a70a0f-0e78-4f55-9eee-62099acf734d\" (UID: \"24a70a0f-0e78-4f55-9eee-62099acf734d\") " Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.695256 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24a70a0f-0e78-4f55-9eee-62099acf734d-kube-api-access-dnm5w" (OuterVolumeSpecName: "kube-api-access-dnm5w") pod "24a70a0f-0e78-4f55-9eee-62099acf734d" (UID: "24a70a0f-0e78-4f55-9eee-62099acf734d"). InnerVolumeSpecName "kube-api-access-dnm5w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.778967 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.792250 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnm5w\" (UniqueName: \"kubernetes.io/projected/24a70a0f-0e78-4f55-9eee-62099acf734d-kube-api-access-dnm5w\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:22 crc kubenswrapper[4994]: I0310 00:13:22.060248 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:13:22 crc kubenswrapper[4994]: I0310 00:13:22.060314 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:13:22 crc kubenswrapper[4994]: I0310 00:13:22.060909 4994 scope.go:117] "RemoveContainer" containerID="e7fca198849ed5918e32d90439680c268c9d2e25eff70cc1e2ce92503bc67d85" Mar 10 00:13:22 crc kubenswrapper[4994]: E0310 00:13:22.061409 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-tgf68_openshift-marketplace(b85bbdaa-daa8-4c69-abf9-9f1200eb07cd)\"" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" Mar 10 00:13:22 crc kubenswrapper[4994]: I0310 00:13:22.295014 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 10 00:13:22 crc kubenswrapper[4994]: I0310 00:13:22.333158 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551692-29hls" 
event={"ID":"24a70a0f-0e78-4f55-9eee-62099acf734d","Type":"ContainerDied","Data":"87f833a4ec9d30c9292ef3b7519a0b401b021a95f8b7dbbbbcf037633a46166b"} Mar 10 00:13:22 crc kubenswrapper[4994]: I0310 00:13:22.333467 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87f833a4ec9d30c9292ef3b7519a0b401b021a95f8b7dbbbbcf037633a46166b" Mar 10 00:13:22 crc kubenswrapper[4994]: I0310 00:13:22.333230 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551692-29hls" Mar 10 00:13:23 crc kubenswrapper[4994]: I0310 00:13:23.744315 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 10 00:13:23 crc kubenswrapper[4994]: I0310 00:13:23.828161 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 10 00:13:24 crc kubenswrapper[4994]: I0310 00:13:24.065853 4994 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.018546 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.306216 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.359828 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.359929 4994 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f159c5160bbebc2d55cd9bb33ea390e800dbff7ea9620c436631139ca88c6b3b" 
exitCode=137 Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.359980 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18a5e48ee962680b8c06a3e5a5396e0f805c9148a02be8cc74fe52d50407bdb6" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.378829 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.378928 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.458639 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.458701 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.458784 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.458844 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: 
"f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.458904 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.458967 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.458864 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.459146 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.459253 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.459467 4994 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.459489 4994 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.459508 4994 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.459523 4994 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.470072 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.561000 4994 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:26 crc kubenswrapper[4994]: I0310 00:13:26.151134 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 10 00:13:26 crc kubenswrapper[4994]: I0310 00:13:26.367061 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:13:26 crc kubenswrapper[4994]: I0310 00:13:26.569651 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 10 00:13:26 crc kubenswrapper[4994]: I0310 00:13:26.570169 4994 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 10 00:13:26 crc kubenswrapper[4994]: I0310 00:13:26.585805 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 00:13:26 crc kubenswrapper[4994]: I0310 00:13:26.585854 4994 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="a7d7465c-7086-43ed-96f0-3fed2dc918a1" Mar 10 00:13:26 crc kubenswrapper[4994]: I0310 00:13:26.593055 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 00:13:26 crc kubenswrapper[4994]: I0310 00:13:26.593110 4994 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
mirrorPodUID="a7d7465c-7086-43ed-96f0-3fed2dc918a1" Mar 10 00:13:26 crc kubenswrapper[4994]: I0310 00:13:26.641694 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 10 00:13:27 crc kubenswrapper[4994]: I0310 00:13:27.380805 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 10 00:13:27 crc kubenswrapper[4994]: I0310 00:13:27.556845 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 10 00:13:33 crc kubenswrapper[4994]: I0310 00:13:33.555034 4994 scope.go:117] "RemoveContainer" containerID="e7fca198849ed5918e32d90439680c268c9d2e25eff70cc1e2ce92503bc67d85" Mar 10 00:13:33 crc kubenswrapper[4994]: E0310 00:13:33.556345 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-tgf68_openshift-marketplace(b85bbdaa-daa8-4c69-abf9-9f1200eb07cd)\"" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.008496 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bwzk5"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.009388 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bwzk5" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" containerName="registry-server" containerID="cri-o://8050e9fca1f15bb15fdfd5da3939a33eede922d4e417035502ed98660f52b965" gracePeriod=30 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.025589 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c4tz9"] Mar 10 00:13:41 
crc kubenswrapper[4994]: I0310 00:13:41.025913 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c4tz9" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerName="registry-server" containerID="cri-o://c85cb3939718589608e5c84bc5f793f0ac91554e53660eb93bb8216dc7e11be6" gracePeriod=30 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.034457 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s7qcn"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.034727 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s7qcn" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" containerName="registry-server" containerID="cri-o://06150f7e8e87fb17f0464494725fda3c7ecd07c48589662349d651d3700f6139" gracePeriod=30 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.040179 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zv2kt"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.040387 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zv2kt" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" containerName="registry-server" containerID="cri-o://ed0179eebea68ea16a7ca2db77939a49213c1a39d735deda526e14363858855f" gracePeriod=30 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.048345 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tgf68"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.058041 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bzrd2"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.058677 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bzrd2" 
podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerName="registry-server" containerID="cri-o://4ac24c0285d02489c6991b060b56e4de350d7660656ffb13b6c269d046020cdf" gracePeriod=30 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.069586 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrh9x"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.069970 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hrh9x" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerName="registry-server" containerID="cri-o://0872f65b4051bd51ddef4f02f90e17a29e60ddaf92da0bcd501b161644707129" gracePeriod=30 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.076577 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7987b4b568-rd8h8"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.093964 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ppfwk"] Mar 10 00:13:41 crc kubenswrapper[4994]: E0310 00:13:41.094241 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.094255 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 00:13:41 crc kubenswrapper[4994]: E0310 00:13:41.094278 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a70a0f-0e78-4f55-9eee-62099acf734d" containerName="oc" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.094288 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a70a0f-0e78-4f55-9eee-62099acf734d" containerName="oc" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.094420 4994 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.094440 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a70a0f-0e78-4f55-9eee-62099acf734d" containerName="oc" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.094996 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.103729 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t5kj4"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.104009 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t5kj4" podUID="0429fae4-1356-4d61-86a3-267f74f27636" containerName="registry-server" containerID="cri-o://ae53ee029113a53cb7678f130696b300909ff84d74bb704709a978c1c97a24b4" gracePeriod=30 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.110590 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wpd8k"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.110831 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wpd8k" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" containerName="registry-server" containerID="cri-o://68b888f4cbe6324465ef819f044ee90195b5c11bf20ab3d53a6faef3352a36c6" gracePeriod=30 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.146864 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ppfwk"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.166286 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.166533 4994 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg" podUID="3653335d-178c-4df8-a93d-4d19011298fe" containerName="route-controller-manager" containerID="cri-o://fb93dc31c2039db01d5ef603fb8cadf69dce66a5580028d2b9b19cc49d45e88a" gracePeriod=30 Mar 10 00:13:41 crc kubenswrapper[4994]: E0310 00:13:41.179499 4994 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4a4dc2d_502f_4c05_ab76_1cc708f13006.slice/crio-0872f65b4051bd51ddef4f02f90e17a29e60ddaf92da0bcd501b161644707129.scope\": RecentStats: unable to find data in memory cache]" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.202340 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46c4619e-ab9f-4fd9-9f3e-5b7ba9415823-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ppfwk\" (UID: \"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823\") " pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.202380 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcft9\" (UniqueName: \"kubernetes.io/projected/46c4619e-ab9f-4fd9-9f3e-5b7ba9415823-kube-api-access-vcft9\") pod \"marketplace-operator-79b997595-ppfwk\" (UID: \"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823\") " pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.202434 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/46c4619e-ab9f-4fd9-9f3e-5b7ba9415823-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-ppfwk\" (UID: \"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823\") " pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.303012 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46c4619e-ab9f-4fd9-9f3e-5b7ba9415823-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ppfwk\" (UID: \"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823\") " pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.303060 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcft9\" (UniqueName: \"kubernetes.io/projected/46c4619e-ab9f-4fd9-9f3e-5b7ba9415823-kube-api-access-vcft9\") pod \"marketplace-operator-79b997595-ppfwk\" (UID: \"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823\") " pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.303115 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/46c4619e-ab9f-4fd9-9f3e-5b7ba9415823-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ppfwk\" (UID: \"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823\") " pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.304799 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46c4619e-ab9f-4fd9-9f3e-5b7ba9415823-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ppfwk\" (UID: \"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823\") " pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.315725 4994 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/46c4619e-ab9f-4fd9-9f3e-5b7ba9415823-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ppfwk\" (UID: \"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823\") " pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.317792 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcft9\" (UniqueName: \"kubernetes.io/projected/46c4619e-ab9f-4fd9-9f3e-5b7ba9415823-kube-api-access-vcft9\") pod \"marketplace-operator-79b997595-ppfwk\" (UID: \"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823\") " pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.484461 4994 generic.go:334] "Generic (PLEG): container finished" podID="6525b40b-1c23-4533-a025-4d86bc406f00" containerID="68b888f4cbe6324465ef819f044ee90195b5c11bf20ab3d53a6faef3352a36c6" exitCode=0 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.484823 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpd8k" event={"ID":"6525b40b-1c23-4533-a025-4d86bc406f00","Type":"ContainerDied","Data":"68b888f4cbe6324465ef819f044ee90195b5c11bf20ab3d53a6faef3352a36c6"} Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.502787 4994 generic.go:334] "Generic (PLEG): container finished" podID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerID="c85cb3939718589608e5c84bc5f793f0ac91554e53660eb93bb8216dc7e11be6" exitCode=0 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.502828 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4tz9" event={"ID":"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5","Type":"ContainerDied","Data":"c85cb3939718589608e5c84bc5f793f0ac91554e53660eb93bb8216dc7e11be6"} Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.508743 4994 generic.go:334] 
"Generic (PLEG): container finished" podID="0429fae4-1356-4d61-86a3-267f74f27636" containerID="ae53ee029113a53cb7678f130696b300909ff84d74bb704709a978c1c97a24b4" exitCode=0 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.508794 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5kj4" event={"ID":"0429fae4-1356-4d61-86a3-267f74f27636","Type":"ContainerDied","Data":"ae53ee029113a53cb7678f130696b300909ff84d74bb704709a978c1c97a24b4"} Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.514518 4994 generic.go:334] "Generic (PLEG): container finished" podID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerID="0872f65b4051bd51ddef4f02f90e17a29e60ddaf92da0bcd501b161644707129" exitCode=0 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.514566 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrh9x" event={"ID":"a4a4dc2d-502f-4c05-ab76-1cc708f13006","Type":"ContainerDied","Data":"0872f65b4051bd51ddef4f02f90e17a29e60ddaf92da0bcd501b161644707129"} Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.517290 4994 generic.go:334] "Generic (PLEG): container finished" podID="fdad0261-804d-41dc-8a25-48018f136c0f" containerID="8050e9fca1f15bb15fdfd5da3939a33eede922d4e417035502ed98660f52b965" exitCode=0 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.517385 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwzk5" event={"ID":"fdad0261-804d-41dc-8a25-48018f136c0f","Type":"ContainerDied","Data":"8050e9fca1f15bb15fdfd5da3939a33eede922d4e417035502ed98660f52b965"} Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.520080 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-tgf68_b85bbdaa-daa8-4c69-abf9-9f1200eb07cd/marketplace-operator/2.log" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.520190 4994 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" event={"ID":"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd","Type":"ContainerDied","Data":"18370ebb096ba95df384c4822d63f9eeb86d553280650a21652dc981b373eee5"} Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.520320 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18370ebb096ba95df384c4822d63f9eeb86d553280650a21652dc981b373eee5" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.524824 4994 generic.go:334] "Generic (PLEG): container finished" podID="76aa065c-ed60-4237-b36f-5ce2865256ff" containerID="ed0179eebea68ea16a7ca2db77939a49213c1a39d735deda526e14363858855f" exitCode=0 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.524886 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zv2kt" event={"ID":"76aa065c-ed60-4237-b36f-5ce2865256ff","Type":"ContainerDied","Data":"ed0179eebea68ea16a7ca2db77939a49213c1a39d735deda526e14363858855f"} Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.529278 4994 generic.go:334] "Generic (PLEG): container finished" podID="3653335d-178c-4df8-a93d-4d19011298fe" containerID="fb93dc31c2039db01d5ef603fb8cadf69dce66a5580028d2b9b19cc49d45e88a" exitCode=0 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.529323 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg" event={"ID":"3653335d-178c-4df8-a93d-4d19011298fe","Type":"ContainerDied","Data":"fb93dc31c2039db01d5ef603fb8cadf69dce66a5580028d2b9b19cc49d45e88a"} Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.533882 4994 generic.go:334] "Generic (PLEG): container finished" podID="abe30cce-8379-4db8-838b-f48b4bc96621" containerID="06150f7e8e87fb17f0464494725fda3c7ecd07c48589662349d651d3700f6139" exitCode=0 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.533926 4994 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-s7qcn" event={"ID":"abe30cce-8379-4db8-838b-f48b4bc96621","Type":"ContainerDied","Data":"06150f7e8e87fb17f0464494725fda3c7ecd07c48589662349d651d3700f6139"} Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.536336 4994 generic.go:334] "Generic (PLEG): container finished" podID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerID="4ac24c0285d02489c6991b060b56e4de350d7660656ffb13b6c269d046020cdf" exitCode=0 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.536363 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzrd2" event={"ID":"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c","Type":"ContainerDied","Data":"4ac24c0285d02489c6991b060b56e4de350d7660656ffb13b6c269d046020cdf"} Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.536501 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" podUID="ae5ec419-c993-43ff-b664-703b8b5a3d5a" containerName="controller-manager" containerID="cri-o://436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514" gracePeriod=30 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.727253 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.730796 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-tgf68_b85bbdaa-daa8-4c69-abf9-9f1200eb07cd/marketplace-operator/2.log" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.730862 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.736169 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.741335 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s7qcn" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.750965 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.758036 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.763666 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.771700 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wpd8k" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.780804 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.785570 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t5kj4" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.792348 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.808945 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-catalog-content\") pod \"fdad0261-804d-41dc-8a25-48018f136c0f\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809024 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fbrm\" (UniqueName: \"kubernetes.io/projected/abe30cce-8379-4db8-838b-f48b4bc96621-kube-api-access-2fbrm\") pod \"abe30cce-8379-4db8-838b-f48b4bc96621\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809076 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-operator-metrics\") pod \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809110 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-utilities\") pod \"fdad0261-804d-41dc-8a25-48018f136c0f\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809431 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-utilities\") pod \"abe30cce-8379-4db8-838b-f48b4bc96621\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809466 4994 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-l95st\" (UniqueName: \"kubernetes.io/projected/fdad0261-804d-41dc-8a25-48018f136c0f-kube-api-access-l95st\") pod \"fdad0261-804d-41dc-8a25-48018f136c0f\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809491 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-catalog-content\") pod \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809634 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-trusted-ca\") pod \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809738 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-utilities\") pod \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809759 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-catalog-content\") pod \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809778 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-utilities\") pod \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\" (UID: 
\"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809797 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-catalog-content\") pod \"abe30cce-8379-4db8-838b-f48b4bc96621\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809827 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdkv4\" (UniqueName: \"kubernetes.io/projected/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-kube-api-access-fdkv4\") pod \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809846 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7278l\" (UniqueName: \"kubernetes.io/projected/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-kube-api-access-7278l\") pod \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809910 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfh8s\" (UniqueName: \"kubernetes.io/projected/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-kube-api-access-xfh8s\") pod \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.811477 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-utilities" (OuterVolumeSpecName: "utilities") pod "64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" (UID: "64ec1b6f-2c0f-4cfc-be18-a2d311fae68c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.812422 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-utilities" (OuterVolumeSpecName: "utilities") pod "abe30cce-8379-4db8-838b-f48b4bc96621" (UID: "abe30cce-8379-4db8-838b-f48b4bc96621"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.813559 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-utilities" (OuterVolumeSpecName: "utilities") pod "ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" (UID: "ab6cd76f-6272-4fcd-8c75-3040c45ef1b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.815585 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-kube-api-access-fdkv4" (OuterVolumeSpecName: "kube-api-access-fdkv4") pod "64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" (UID: "64ec1b6f-2c0f-4cfc-be18-a2d311fae68c"). InnerVolumeSpecName "kube-api-access-fdkv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.819387 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" (UID: "b85bbdaa-daa8-4c69-abf9-9f1200eb07cd"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.821360 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-kube-api-access-xfh8s" (OuterVolumeSpecName: "kube-api-access-xfh8s") pod "ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" (UID: "ab6cd76f-6272-4fcd-8c75-3040c45ef1b5"). InnerVolumeSpecName "kube-api-access-xfh8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.826427 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" (UID: "b85bbdaa-daa8-4c69-abf9-9f1200eb07cd"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.829332 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-kube-api-access-7278l" (OuterVolumeSpecName: "kube-api-access-7278l") pod "b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" (UID: "b85bbdaa-daa8-4c69-abf9-9f1200eb07cd"). InnerVolumeSpecName "kube-api-access-7278l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.840088 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdad0261-804d-41dc-8a25-48018f136c0f-kube-api-access-l95st" (OuterVolumeSpecName: "kube-api-access-l95st") pod "fdad0261-804d-41dc-8a25-48018f136c0f" (UID: "fdad0261-804d-41dc-8a25-48018f136c0f"). InnerVolumeSpecName "kube-api-access-l95st". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.858053 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abe30cce-8379-4db8-838b-f48b4bc96621-kube-api-access-2fbrm" (OuterVolumeSpecName: "kube-api-access-2fbrm") pod "abe30cce-8379-4db8-838b-f48b4bc96621" (UID: "abe30cce-8379-4db8-838b-f48b4bc96621"). InnerVolumeSpecName "kube-api-access-2fbrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.859099 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-utilities" (OuterVolumeSpecName: "utilities") pod "fdad0261-804d-41dc-8a25-48018f136c0f" (UID: "fdad0261-804d-41dc-8a25-48018f136c0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.859154 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" (UID: "64ec1b6f-2c0f-4cfc-be18-a2d311fae68c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.875312 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abe30cce-8379-4db8-838b-f48b4bc96621" (UID: "abe30cce-8379-4db8-838b-f48b4bc96621"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.879217 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912553 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-catalog-content\") pod \"6525b40b-1c23-4533-a025-4d86bc406f00\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912604 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frwzt\" (UniqueName: \"kubernetes.io/projected/0429fae4-1356-4d61-86a3-267f74f27636-kube-api-access-frwzt\") pod \"0429fae4-1356-4d61-86a3-267f74f27636\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912645 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-config\") pod \"3653335d-178c-4df8-a93d-4d19011298fe\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912677 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-utilities\") pod \"0429fae4-1356-4d61-86a3-267f74f27636\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912707 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-utilities\") pod \"76aa065c-ed60-4237-b36f-5ce2865256ff\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912751 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-catalog-content\") pod \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912771 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-utilities\") pod \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912805 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3653335d-178c-4df8-a93d-4d19011298fe-serving-cert\") pod \"3653335d-178c-4df8-a93d-4d19011298fe\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912835 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-utilities\") pod \"6525b40b-1c23-4533-a025-4d86bc406f00\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912860 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btt9c\" (UniqueName: \"kubernetes.io/projected/a4a4dc2d-502f-4c05-ab76-1cc708f13006-kube-api-access-btt9c\") pod \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912898 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8zbr\" (UniqueName: \"kubernetes.io/projected/3653335d-178c-4df8-a93d-4d19011298fe-kube-api-access-f8zbr\") pod \"3653335d-178c-4df8-a93d-4d19011298fe\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " Mar 10 00:13:41 
crc kubenswrapper[4994]: I0310 00:13:41.912922 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-catalog-content\") pod \"0429fae4-1356-4d61-86a3-267f74f27636\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912946 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-client-ca\") pod \"3653335d-178c-4df8-a93d-4d19011298fe\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912971 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-catalog-content\") pod \"76aa065c-ed60-4237-b36f-5ce2865256ff\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.913010 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lx2s\" (UniqueName: \"kubernetes.io/projected/76aa065c-ed60-4237-b36f-5ce2865256ff-kube-api-access-5lx2s\") pod \"76aa065c-ed60-4237-b36f-5ce2865256ff\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.914584 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-utilities" (OuterVolumeSpecName: "utilities") pod "0429fae4-1356-4d61-86a3-267f74f27636" (UID: "0429fae4-1356-4d61-86a3-267f74f27636"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.914840 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-utilities" (OuterVolumeSpecName: "utilities") pod "76aa065c-ed60-4237-b36f-5ce2865256ff" (UID: "76aa065c-ed60-4237-b36f-5ce2865256ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.915342 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-utilities" (OuterVolumeSpecName: "utilities") pod "a4a4dc2d-502f-4c05-ab76-1cc708f13006" (UID: "a4a4dc2d-502f-4c05-ab76-1cc708f13006"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.915518 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpl2h\" (UniqueName: \"kubernetes.io/projected/6525b40b-1c23-4533-a025-4d86bc406f00-kube-api-access-xpl2h\") pod \"6525b40b-1c23-4533-a025-4d86bc406f00\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.916527 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-utilities" (OuterVolumeSpecName: "utilities") pod "6525b40b-1c23-4533-a025-4d86bc406f00" (UID: "6525b40b-1c23-4533-a025-4d86bc406f00"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918627 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918646 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfh8s\" (UniqueName: \"kubernetes.io/projected/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-kube-api-access-xfh8s\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918664 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fbrm\" (UniqueName: \"kubernetes.io/projected/abe30cce-8379-4db8-838b-f48b4bc96621-kube-api-access-2fbrm\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918675 4994 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918686 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918697 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918707 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l95st\" (UniqueName: \"kubernetes.io/projected/fdad0261-804d-41dc-8a25-48018f136c0f-kube-api-access-l95st\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc 
kubenswrapper[4994]: I0310 00:13:41.918717 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918726 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918738 4994 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918747 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918756 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918768 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918780 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918855 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918866 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdkv4\" (UniqueName: \"kubernetes.io/projected/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-kube-api-access-fdkv4\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918949 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7278l\" (UniqueName: \"kubernetes.io/projected/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-kube-api-access-7278l\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.922797 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0429fae4-1356-4d61-86a3-267f74f27636-kube-api-access-frwzt" (OuterVolumeSpecName: "kube-api-access-frwzt") pod "0429fae4-1356-4d61-86a3-267f74f27636" (UID: "0429fae4-1356-4d61-86a3-267f74f27636"). InnerVolumeSpecName "kube-api-access-frwzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.925669 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-config" (OuterVolumeSpecName: "config") pod "3653335d-178c-4df8-a93d-4d19011298fe" (UID: "3653335d-178c-4df8-a93d-4d19011298fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.925935 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a4dc2d-502f-4c05-ab76-1cc708f13006-kube-api-access-btt9c" (OuterVolumeSpecName: "kube-api-access-btt9c") pod "a4a4dc2d-502f-4c05-ab76-1cc708f13006" (UID: "a4a4dc2d-502f-4c05-ab76-1cc708f13006"). InnerVolumeSpecName "kube-api-access-btt9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.926267 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-client-ca" (OuterVolumeSpecName: "client-ca") pod "3653335d-178c-4df8-a93d-4d19011298fe" (UID: "3653335d-178c-4df8-a93d-4d19011298fe"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.927987 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3653335d-178c-4df8-a93d-4d19011298fe-kube-api-access-f8zbr" (OuterVolumeSpecName: "kube-api-access-f8zbr") pod "3653335d-178c-4df8-a93d-4d19011298fe" (UID: "3653335d-178c-4df8-a93d-4d19011298fe"). InnerVolumeSpecName "kube-api-access-f8zbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.929265 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6525b40b-1c23-4533-a025-4d86bc406f00-kube-api-access-xpl2h" (OuterVolumeSpecName: "kube-api-access-xpl2h") pod "6525b40b-1c23-4533-a025-4d86bc406f00" (UID: "6525b40b-1c23-4533-a025-4d86bc406f00"). InnerVolumeSpecName "kube-api-access-xpl2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.937986 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76aa065c-ed60-4237-b36f-5ce2865256ff-kube-api-access-5lx2s" (OuterVolumeSpecName: "kube-api-access-5lx2s") pod "76aa065c-ed60-4237-b36f-5ce2865256ff" (UID: "76aa065c-ed60-4237-b36f-5ce2865256ff"). InnerVolumeSpecName "kube-api-access-5lx2s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.938426 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3653335d-178c-4df8-a93d-4d19011298fe-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3653335d-178c-4df8-a93d-4d19011298fe" (UID: "3653335d-178c-4df8-a93d-4d19011298fe"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.969535 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" (UID: "ab6cd76f-6272-4fcd-8c75-3040c45ef1b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.971131 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4a4dc2d-502f-4c05-ab76-1cc708f13006" (UID: "a4a4dc2d-502f-4c05-ab76-1cc708f13006"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.983849 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76aa065c-ed60-4237-b36f-5ce2865256ff" (UID: "76aa065c-ed60-4237-b36f-5ce2865256ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.985633 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fdad0261-804d-41dc-8a25-48018f136c0f" (UID: "fdad0261-804d-41dc-8a25-48018f136c0f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019359 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-config\") pod \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019444 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-client-ca\") pod \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019476 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae5ec419-c993-43ff-b664-703b8b5a3d5a-serving-cert\") pod \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019495 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-proxy-ca-bundles\") pod \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019542 4994 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-p9lft\" (UniqueName: \"kubernetes.io/projected/ae5ec419-c993-43ff-b664-703b8b5a3d5a-kube-api-access-p9lft\") pod \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019751 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frwzt\" (UniqueName: \"kubernetes.io/projected/0429fae4-1356-4d61-86a3-267f74f27636-kube-api-access-frwzt\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019764 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019774 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019783 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019791 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3653335d-178c-4df8-a93d-4d19011298fe-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019799 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btt9c\" (UniqueName: \"kubernetes.io/projected/a4a4dc2d-502f-4c05-ab76-1cc708f13006-kube-api-access-btt9c\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019807 4994 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-f8zbr\" (UniqueName: \"kubernetes.io/projected/3653335d-178c-4df8-a93d-4d19011298fe-kube-api-access-f8zbr\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019815 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019823 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019831 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019839 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lx2s\" (UniqueName: \"kubernetes.io/projected/76aa065c-ed60-4237-b36f-5ce2865256ff-kube-api-access-5lx2s\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019848 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpl2h\" (UniqueName: \"kubernetes.io/projected/6525b40b-1c23-4533-a025-4d86bc406f00-kube-api-access-xpl2h\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.020212 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-client-ca" (OuterVolumeSpecName: "client-ca") pod "ae5ec419-c993-43ff-b664-703b8b5a3d5a" (UID: "ae5ec419-c993-43ff-b664-703b8b5a3d5a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.020292 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-config" (OuterVolumeSpecName: "config") pod "ae5ec419-c993-43ff-b664-703b8b5a3d5a" (UID: "ae5ec419-c993-43ff-b664-703b8b5a3d5a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.020524 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ae5ec419-c993-43ff-b664-703b8b5a3d5a" (UID: "ae5ec419-c993-43ff-b664-703b8b5a3d5a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.023592 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae5ec419-c993-43ff-b664-703b8b5a3d5a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ae5ec419-c993-43ff-b664-703b8b5a3d5a" (UID: "ae5ec419-c993-43ff-b664-703b8b5a3d5a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.025069 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae5ec419-c993-43ff-b664-703b8b5a3d5a-kube-api-access-p9lft" (OuterVolumeSpecName: "kube-api-access-p9lft") pod "ae5ec419-c993-43ff-b664-703b8b5a3d5a" (UID: "ae5ec419-c993-43ff-b664-703b8b5a3d5a"). InnerVolumeSpecName "kube-api-access-p9lft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.069285 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0429fae4-1356-4d61-86a3-267f74f27636" (UID: "0429fae4-1356-4d61-86a3-267f74f27636"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.074323 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6525b40b-1c23-4533-a025-4d86bc406f00" (UID: "6525b40b-1c23-4533-a025-4d86bc406f00"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.120491 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.120522 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae5ec419-c993-43ff-b664-703b8b5a3d5a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.120530 4994 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.120541 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-catalog-content\") on node \"crc\" 
DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.120551 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9lft\" (UniqueName: \"kubernetes.io/projected/ae5ec419-c993-43ff-b664-703b8b5a3d5a-kube-api-access-p9lft\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.120561 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.120569 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.145328 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ppfwk"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.528959 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-56bcc8b679-5m2f4"] Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529697 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529719 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529730 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529740 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerName="registry-server" Mar 10 
00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529753 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529761 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529771 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529782 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529794 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529802 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529812 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3653335d-178c-4df8-a93d-4d19011298fe" containerName="route-controller-manager" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529819 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="3653335d-178c-4df8-a93d-4d19011298fe" containerName="route-controller-manager" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529832 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529839 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" 
containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529849 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0429fae4-1356-4d61-86a3-267f74f27636" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529857 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="0429fae4-1356-4d61-86a3-267f74f27636" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529896 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae5ec419-c993-43ff-b664-703b8b5a3d5a" containerName="controller-manager" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529907 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae5ec419-c993-43ff-b664-703b8b5a3d5a" containerName="controller-manager" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529922 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529931 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529940 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0429fae4-1356-4d61-86a3-267f74f27636" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529948 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="0429fae4-1356-4d61-86a3-267f74f27636" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529959 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529968 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" 
containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529980 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529990 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529999 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530007 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530017 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530024 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530041 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530049 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530059 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530067 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" 
containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530078 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530086 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530123 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530131 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530143 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0429fae4-1356-4d61-86a3-267f74f27636" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530152 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="0429fae4-1356-4d61-86a3-267f74f27636" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530164 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530173 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530188 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerName="marketplace-operator" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530198 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" 
containerName="marketplace-operator" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530211 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530222 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530235 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530242 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530253 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530261 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530270 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530277 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530290 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerName="marketplace-operator" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530297 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" 
containerName="marketplace-operator" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530307 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerName="marketplace-operator" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530314 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerName="marketplace-operator" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530325 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530332 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530459 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerName="marketplace-operator" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530473 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerName="marketplace-operator" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530482 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530493 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="0429fae4-1356-4d61-86a3-267f74f27636" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530503 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerName="marketplace-operator" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530515 4994 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ae5ec419-c993-43ff-b664-703b8b5a3d5a" containerName="controller-manager" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530525 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530537 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530548 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530558 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530570 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530582 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530592 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="3653335d-178c-4df8-a93d-4d19011298fe" containerName="route-controller-manager" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.531158 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.531931 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.533366 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.544435 4994 generic.go:334] "Generic (PLEG): container finished" podID="ae5ec419-c993-43ff-b664-703b8b5a3d5a" containerID="436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514" exitCode=0 Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.544526 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.544555 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" event={"ID":"ae5ec419-c993-43ff-b664-703b8b5a3d5a","Type":"ContainerDied","Data":"436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.544597 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" event={"ID":"ae5ec419-c993-43ff-b664-703b8b5a3d5a","Type":"ContainerDied","Data":"83f86e0cd6062d10a147cbaca0b7678a7e90a57f0aee2f43c47901ded05219f4"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.544625 4994 scope.go:117] "RemoveContainer" containerID="436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.548006 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-bzrd2" event={"ID":"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c","Type":"ContainerDied","Data":"d1b32d28a2daabcbb6951ddc2404e012b74605f090a8de0ccde979112a9da8a3"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.548189 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.553478 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg" event={"ID":"3653335d-178c-4df8-a93d-4d19011298fe","Type":"ContainerDied","Data":"86f6391bf2290461b2367cbde0871fb0815774ff3d6099505b8889c9a6ec884a"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.553655 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.561773 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wpd8k" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.572757 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.574141 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56bcc8b679-5m2f4"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.574194 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.574217 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpd8k" event={"ID":"6525b40b-1c23-4533-a025-4d86bc406f00","Type":"ContainerDied","Data":"45f9adf9166a73c98a80bad9d037f2560042ccbdeec2aa18a7cbfa8528d64c9a"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.574251 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwzk5" event={"ID":"fdad0261-804d-41dc-8a25-48018f136c0f","Type":"ContainerDied","Data":"97695118e06c708e4423796822ac54cea80fe3fc3b7289e71d5f6ac300dfeb72"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.579790 4994 scope.go:117] "RemoveContainer" containerID="436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.580283 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514\": container with ID starting with 436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514 not found: ID does not exist" containerID="436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.580332 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514"} err="failed to 
get container status \"436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514\": rpc error: code = NotFound desc = could not find container \"436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514\": container with ID starting with 436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514 not found: ID does not exist" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.580366 4994 scope.go:117] "RemoveContainer" containerID="4ac24c0285d02489c6991b060b56e4de350d7660656ffb13b6c269d046020cdf" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.583772 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4tz9" event={"ID":"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5","Type":"ContainerDied","Data":"7734e5951b02bb3a0a46ea5a16ee396269ee4c95d8725567063c864699e319c0"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.583890 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.588316 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" event={"ID":"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823","Type":"ContainerStarted","Data":"0fec64e6185812db73ed584e560e0a50856bffe1c60d52e399b5cbca436dfdc9"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.598270 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.599026 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" event={"ID":"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823","Type":"ContainerStarted","Data":"f95cfb0b7116267392dd4c5e2d33e1273f81a2beef99d98c4632979e5c8bb3dc"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.599659 4994 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.600023 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5kj4" event={"ID":"0429fae4-1356-4d61-86a3-267f74f27636","Type":"ContainerDied","Data":"c31a76fcba6e0a2edf574624c6292a3f51560169f1f1fa309a2ea336e40231d4"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.600502 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zv2kt" event={"ID":"76aa065c-ed60-4237-b36f-5ce2865256ff","Type":"ContainerDied","Data":"3e81512696f04f227cf371ddcf1556e047699059d617fb7f43f9cba658930f7f"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.600548 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrh9x" event={"ID":"a4a4dc2d-502f-4c05-ab76-1cc708f13006","Type":"ContainerDied","Data":"5863f96f41db6fc401ecb7000e3f5a6cfef96961acdd0f3f461004c58668116e"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.590053 4994 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ppfwk container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused" start-of-body= Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.600658 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" podUID="46c4619e-ab9f-4fd9-9f3e-5b7ba9415823" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.592226 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t5kj4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.597181 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.602685 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.602932 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7qcn" event={"ID":"abe30cce-8379-4db8-838b-f48b4bc96621","Type":"ContainerDied","Data":"a14572119f65e2d0fbfc63101455582a2b9abfe6948028817dc155c8d3a9c7ab"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.603172 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s7qcn" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.628144 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w5zm\" (UniqueName: \"kubernetes.io/projected/d3433a2f-3af8-44f9-bb23-67dac303c015-kube-api-access-8w5zm\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.628219 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-client-ca\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.628251 4994 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-proxy-ca-bundles\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.628277 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-config\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.628321 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vrzp\" (UniqueName: \"kubernetes.io/projected/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-kube-api-access-9vrzp\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.628353 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-client-ca\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.628368 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-config\") pod 
\"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.628395 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-serving-cert\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.628412 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3433a2f-3af8-44f9-bb23-67dac303c015-serving-cert\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.644764 4994 scope.go:117] "RemoveContainer" containerID="8a4cdb4758a8d66ac4d964d75c363e426be3ea4f0d96bd2b4370bc01dbce1a3f" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.645920 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" podStartSLOduration=1.645891338 podStartE2EDuration="1.645891338s" podCreationTimestamp="2026-03-10 00:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:13:42.638082153 +0000 UTC m=+436.811788902" watchObservedRunningTime="2026-03-10 00:13:42.645891338 +0000 UTC m=+436.819598107" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.671130 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-7987b4b568-rd8h8"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.688942 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7987b4b568-rd8h8"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.696626 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bzrd2"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.697938 4994 scope.go:117] "RemoveContainer" containerID="80f8634f7b8323c210693d621de8d8f6643dfa095b77ff7b2c7b90894cebf6e9" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.705231 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bzrd2"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.712825 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wpd8k"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.716425 4994 scope.go:117] "RemoveContainer" containerID="fb93dc31c2039db01d5ef603fb8cadf69dce66a5580028d2b9b19cc49d45e88a" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.724422 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wpd8k"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.729583 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-config\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.729636 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vrzp\" (UniqueName: 
\"kubernetes.io/projected/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-kube-api-access-9vrzp\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.729668 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-client-ca\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.729691 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-config\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.729721 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-serving-cert\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.729744 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3433a2f-3af8-44f9-bb23-67dac303c015-serving-cert\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.729792 
4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w5zm\" (UniqueName: \"kubernetes.io/projected/d3433a2f-3af8-44f9-bb23-67dac303c015-kube-api-access-8w5zm\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.729843 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-client-ca\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.729886 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-proxy-ca-bundles\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.731425 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-client-ca\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.732297 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-config\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " 
pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.733560 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-client-ca\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.734901 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-proxy-ca-bundles\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.742639 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3433a2f-3af8-44f9-bb23-67dac303c015-serving-cert\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.753534 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-serving-cert\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.753540 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w5zm\" (UniqueName: 
\"kubernetes.io/projected/d3433a2f-3af8-44f9-bb23-67dac303c015-kube-api-access-8w5zm\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.756715 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vrzp\" (UniqueName: \"kubernetes.io/projected/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-kube-api-access-9vrzp\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.756741 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-config\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.766448 4994 scope.go:117] "RemoveContainer" containerID="68b888f4cbe6324465ef819f044ee90195b5c11bf20ab3d53a6faef3352a36c6" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.789585 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bwzk5"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.792674 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bwzk5"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.797288 4994 scope.go:117] "RemoveContainer" containerID="c11c2f647a7cbb01d788b2a61a4106505500b1d0634fb464e68d4e4b2d159f7e" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.797552 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zv2kt"] Mar 10 00:13:42 crc 
kubenswrapper[4994]: I0310 00:13:42.801188 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zv2kt"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.804171 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s7qcn"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.806963 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s7qcn"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.810945 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.814787 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.818990 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tgf68"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.822762 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tgf68"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.823406 4994 scope.go:117] "RemoveContainer" containerID="aae194d8e4c12d216b3165e539102c03919767ba8d6987e2d169c5147eb55863" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.825725 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t5kj4"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.829914 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t5kj4"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.838691 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c4tz9"] Mar 10 00:13:42 crc 
kubenswrapper[4994]: I0310 00:13:42.844271 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c4tz9"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.848236 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrh9x"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.851748 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrh9x"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.854851 4994 scope.go:117] "RemoveContainer" containerID="8050e9fca1f15bb15fdfd5da3939a33eede922d4e417035502ed98660f52b965" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.860484 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.866378 4994 scope.go:117] "RemoveContainer" containerID="2ed048e8f43bfa8a5d112e7ab569e89e26c34e5ab5b69b6c77b3a42aea54c386" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.876353 4994 scope.go:117] "RemoveContainer" containerID="d9c63ca86073ed9073e0f89d99a6a3af753621532c12e53bb512c115a6852ded" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.886263 4994 scope.go:117] "RemoveContainer" containerID="c85cb3939718589608e5c84bc5f793f0ac91554e53660eb93bb8216dc7e11be6" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.901804 4994 scope.go:117] "RemoveContainer" containerID="757e8587473ddc7f21a46feaf304e66dbe443eeebcd4628d091bf2c8bec511d9" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.916120 4994 scope.go:117] "RemoveContainer" containerID="b4775c7cc3dc93dde45a7cd1c8d5a247763c1a3b907807a2f3655ae2194f4c42" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.934248 4994 scope.go:117] "RemoveContainer" containerID="ae53ee029113a53cb7678f130696b300909ff84d74bb704709a978c1c97a24b4" Mar 10 00:13:42 crc 
kubenswrapper[4994]: I0310 00:13:42.948389 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.960314 4994 scope.go:117] "RemoveContainer" containerID="979b74ea5a85ff4f5cee3a5418901bcd485b5c3af1206f8500f2ce239b83bc17" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.976137 4994 scope.go:117] "RemoveContainer" containerID="5520b611519111e180e88d1153308daf75771503d82a595371d6519dba75f44f" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.037444 4994 scope.go:117] "RemoveContainer" containerID="ed0179eebea68ea16a7ca2db77939a49213c1a39d735deda526e14363858855f" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.069818 4994 scope.go:117] "RemoveContainer" containerID="6e999aa11768c350f73b98e59c5adafa0716222aea76f3c3cc4ced602c5932bf" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.093708 4994 scope.go:117] "RemoveContainer" containerID="ca43f34122075e40b3a59998e1c0fdcc5eee5438f96f373d4f9e4b36228204ee" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.118027 4994 scope.go:117] "RemoveContainer" containerID="0872f65b4051bd51ddef4f02f90e17a29e60ddaf92da0bcd501b161644707129" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.135894 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8"] Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.144650 4994 scope.go:117] "RemoveContainer" containerID="5fa6127d5cb315c05287e97611be0a26bc929ad831b3970419c02c806f804ed6" Mar 10 00:13:43 crc kubenswrapper[4994]: W0310 00:13:43.163424 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3db4bc6b_389f_4ccf_b70e_eb97114f85e6.slice/crio-8fa7c46882c986280d70a7773e802034ebeacd07531eee3a2b9c40b0bf8c4823 WatchSource:0}: Error finding 
container 8fa7c46882c986280d70a7773e802034ebeacd07531eee3a2b9c40b0bf8c4823: Status 404 returned error can't find the container with id 8fa7c46882c986280d70a7773e802034ebeacd07531eee3a2b9c40b0bf8c4823 Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.167227 4994 scope.go:117] "RemoveContainer" containerID="f5717a500fcfc936aa966df3e8984d98f5ff5ab90d17718d04d543deea170e1a" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.194432 4994 scope.go:117] "RemoveContainer" containerID="06150f7e8e87fb17f0464494725fda3c7ecd07c48589662349d651d3700f6139" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.229178 4994 scope.go:117] "RemoveContainer" containerID="ea2fcc0ccbdd2d99bdd5e8db5934d29568b2c080e467f0b26270d4267b4ac275" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.242342 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56bcc8b679-5m2f4"] Mar 10 00:13:43 crc kubenswrapper[4994]: W0310 00:13:43.253261 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3433a2f_3af8_44f9_bb23_67dac303c015.slice/crio-fc95ae472f51e9ddbc3beb8980e7c48093c9c93d267c9c551e431299fcd96cf9 WatchSource:0}: Error finding container fc95ae472f51e9ddbc3beb8980e7c48093c9c93d267c9c551e431299fcd96cf9: Status 404 returned error can't find the container with id fc95ae472f51e9ddbc3beb8980e7c48093c9c93d267c9c551e431299fcd96cf9 Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.263634 4994 scope.go:117] "RemoveContainer" containerID="b12c2570f0f12ececa7d019201ed8ccc106ee186110fc56077d24c5532fccef4" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.615421 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" event={"ID":"3db4bc6b-389f-4ccf-b70e-eb97114f85e6","Type":"ContainerStarted","Data":"d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6"} 
Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.615471 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" event={"ID":"3db4bc6b-389f-4ccf-b70e-eb97114f85e6","Type":"ContainerStarted","Data":"8fa7c46882c986280d70a7773e802034ebeacd07531eee3a2b9c40b0bf8c4823"} Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.615713 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.622490 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" event={"ID":"d3433a2f-3af8-44f9-bb23-67dac303c015","Type":"ContainerStarted","Data":"d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175"} Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.622541 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" event={"ID":"d3433a2f-3af8-44f9-bb23-67dac303c015","Type":"ContainerStarted","Data":"fc95ae472f51e9ddbc3beb8980e7c48093c9c93d267c9c551e431299fcd96cf9"} Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.626927 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.639991 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" podStartSLOduration=2.639973813 podStartE2EDuration="2.639973813s" podCreationTimestamp="2026-03-10 00:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:13:43.639209253 +0000 UTC m=+437.812916002" 
watchObservedRunningTime="2026-03-10 00:13:43.639973813 +0000 UTC m=+437.813680562" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.734102 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" podStartSLOduration=2.73408333 podStartE2EDuration="2.73408333s" podCreationTimestamp="2026-03-10 00:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:13:43.732304826 +0000 UTC m=+437.906011575" watchObservedRunningTime="2026-03-10 00:13:43.73408333 +0000 UTC m=+437.907790079" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.881796 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.561920 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0429fae4-1356-4d61-86a3-267f74f27636" path="/var/lib/kubelet/pods/0429fae4-1356-4d61-86a3-267f74f27636/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.562801 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3653335d-178c-4df8-a93d-4d19011298fe" path="/var/lib/kubelet/pods/3653335d-178c-4df8-a93d-4d19011298fe/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.563425 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" path="/var/lib/kubelet/pods/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.564834 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" path="/var/lib/kubelet/pods/6525b40b-1c23-4533-a025-4d86bc406f00/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.565610 4994 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" path="/var/lib/kubelet/pods/76aa065c-ed60-4237-b36f-5ce2865256ff/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.566958 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" path="/var/lib/kubelet/pods/a4a4dc2d-502f-4c05-ab76-1cc708f13006/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.567906 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" path="/var/lib/kubelet/pods/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.568693 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" path="/var/lib/kubelet/pods/abe30cce-8379-4db8-838b-f48b4bc96621/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.570176 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae5ec419-c993-43ff-b664-703b8b5a3d5a" path="/var/lib/kubelet/pods/ae5ec419-c993-43ff-b664-703b8b5a3d5a/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.570812 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" path="/var/lib/kubelet/pods/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.572058 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" path="/var/lib/kubelet/pods/fdad0261-804d-41dc-8a25-48018f136c0f/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.637178 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.644741 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:48 crc kubenswrapper[4994]: I0310 00:13:48.892724 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:13:48 crc kubenswrapper[4994]: I0310 00:13:48.892818 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.133862 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551694-pngfv"] Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.135036 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551694-pngfv" Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.137180 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.137631 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.137912 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.146010 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551694-pngfv"] Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.170776 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7m9d\" (UniqueName: \"kubernetes.io/projected/e91ae1c5-3f03-4439-b579-b828884a1b58-kube-api-access-l7m9d\") pod \"auto-csr-approver-29551694-pngfv\" (UID: \"e91ae1c5-3f03-4439-b579-b828884a1b58\") " pod="openshift-infra/auto-csr-approver-29551694-pngfv" Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.271659 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7m9d\" (UniqueName: \"kubernetes.io/projected/e91ae1c5-3f03-4439-b579-b828884a1b58-kube-api-access-l7m9d\") pod \"auto-csr-approver-29551694-pngfv\" (UID: \"e91ae1c5-3f03-4439-b579-b828884a1b58\") " pod="openshift-infra/auto-csr-approver-29551694-pngfv" Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.308486 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7m9d\" (UniqueName: \"kubernetes.io/projected/e91ae1c5-3f03-4439-b579-b828884a1b58-kube-api-access-l7m9d\") pod \"auto-csr-approver-29551694-pngfv\" (UID: \"e91ae1c5-3f03-4439-b579-b828884a1b58\") " 
pod="openshift-infra/auto-csr-approver-29551694-pngfv" Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.456662 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551694-pngfv" Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.993453 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551694-pngfv"] Mar 10 00:14:01 crc kubenswrapper[4994]: I0310 00:14:01.747804 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551694-pngfv" event={"ID":"e91ae1c5-3f03-4439-b579-b828884a1b58","Type":"ContainerStarted","Data":"e68a4ce84d702f64ff9cdfbd72140a08214e1f8507c6d46e35baf2c9a4a20fa7"} Mar 10 00:14:02 crc kubenswrapper[4994]: I0310 00:14:02.762567 4994 generic.go:334] "Generic (PLEG): container finished" podID="e91ae1c5-3f03-4439-b579-b828884a1b58" containerID="ddb1ff554509065a0634194f412b0e90319b501a3735bf3cda900f518d12f147" exitCode=0 Mar 10 00:14:02 crc kubenswrapper[4994]: I0310 00:14:02.762655 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551694-pngfv" event={"ID":"e91ae1c5-3f03-4439-b579-b828884a1b58","Type":"ContainerDied","Data":"ddb1ff554509065a0634194f412b0e90319b501a3735bf3cda900f518d12f147"} Mar 10 00:14:04 crc kubenswrapper[4994]: I0310 00:14:04.195293 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551694-pngfv" Mar 10 00:14:04 crc kubenswrapper[4994]: I0310 00:14:04.323246 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7m9d\" (UniqueName: \"kubernetes.io/projected/e91ae1c5-3f03-4439-b579-b828884a1b58-kube-api-access-l7m9d\") pod \"e91ae1c5-3f03-4439-b579-b828884a1b58\" (UID: \"e91ae1c5-3f03-4439-b579-b828884a1b58\") " Mar 10 00:14:04 crc kubenswrapper[4994]: I0310 00:14:04.330262 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e91ae1c5-3f03-4439-b579-b828884a1b58-kube-api-access-l7m9d" (OuterVolumeSpecName: "kube-api-access-l7m9d") pod "e91ae1c5-3f03-4439-b579-b828884a1b58" (UID: "e91ae1c5-3f03-4439-b579-b828884a1b58"). InnerVolumeSpecName "kube-api-access-l7m9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:14:04 crc kubenswrapper[4994]: I0310 00:14:04.424011 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7m9d\" (UniqueName: \"kubernetes.io/projected/e91ae1c5-3f03-4439-b579-b828884a1b58-kube-api-access-l7m9d\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:04 crc kubenswrapper[4994]: I0310 00:14:04.777373 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551694-pngfv" event={"ID":"e91ae1c5-3f03-4439-b579-b828884a1b58","Type":"ContainerDied","Data":"e68a4ce84d702f64ff9cdfbd72140a08214e1f8507c6d46e35baf2c9a4a20fa7"} Mar 10 00:14:04 crc kubenswrapper[4994]: I0310 00:14:04.777416 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e68a4ce84d702f64ff9cdfbd72140a08214e1f8507c6d46e35baf2c9a4a20fa7" Mar 10 00:14:04 crc kubenswrapper[4994]: I0310 00:14:04.777484 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551694-pngfv" Mar 10 00:14:05 crc kubenswrapper[4994]: I0310 00:14:05.249352 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551688-9zsf6"] Mar 10 00:14:05 crc kubenswrapper[4994]: I0310 00:14:05.254117 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551688-9zsf6"] Mar 10 00:14:06 crc kubenswrapper[4994]: I0310 00:14:06.566576 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1456dd8-5038-4bcc-8f19-51325ac84c02" path="/var/lib/kubelet/pods/a1456dd8-5038-4bcc-8f19-51325ac84c02/volumes" Mar 10 00:14:11 crc kubenswrapper[4994]: I0310 00:14:11.348830 4994 scope.go:117] "RemoveContainer" containerID="4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.358333 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56bcc8b679-5m2f4"] Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.358862 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" podUID="d3433a2f-3af8-44f9-bb23-67dac303c015" containerName="controller-manager" containerID="cri-o://d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175" gracePeriod=30 Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.457236 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8"] Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.458079 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" podUID="3db4bc6b-389f-4ccf-b70e-eb97114f85e6" containerName="route-controller-manager" 
containerID="cri-o://d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6" gracePeriod=30 Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.817558 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.822607 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.850082 4994 generic.go:334] "Generic (PLEG): container finished" podID="3db4bc6b-389f-4ccf-b70e-eb97114f85e6" containerID="d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6" exitCode=0 Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.850140 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" event={"ID":"3db4bc6b-389f-4ccf-b70e-eb97114f85e6","Type":"ContainerDied","Data":"d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6"} Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.850166 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" event={"ID":"3db4bc6b-389f-4ccf-b70e-eb97114f85e6","Type":"ContainerDied","Data":"8fa7c46882c986280d70a7773e802034ebeacd07531eee3a2b9c40b0bf8c4823"} Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.850183 4994 scope.go:117] "RemoveContainer" containerID="d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.850272 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.852410 4994 generic.go:334] "Generic (PLEG): container finished" podID="d3433a2f-3af8-44f9-bb23-67dac303c015" containerID="d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175" exitCode=0 Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.852441 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" event={"ID":"d3433a2f-3af8-44f9-bb23-67dac303c015","Type":"ContainerDied","Data":"d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175"} Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.852461 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" event={"ID":"d3433a2f-3af8-44f9-bb23-67dac303c015","Type":"ContainerDied","Data":"fc95ae472f51e9ddbc3beb8980e7c48093c9c93d267c9c551e431299fcd96cf9"} Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.852505 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.862665 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-proxy-ca-bundles\") pod \"d3433a2f-3af8-44f9-bb23-67dac303c015\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.862706 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-config\") pod \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.862753 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-serving-cert\") pod \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.862804 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-client-ca\") pod \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.862825 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vrzp\" (UniqueName: \"kubernetes.io/projected/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-kube-api-access-9vrzp\") pod \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.862867 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3433a2f-3af8-44f9-bb23-67dac303c015-serving-cert\") pod \"d3433a2f-3af8-44f9-bb23-67dac303c015\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.862906 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-config\") pod \"d3433a2f-3af8-44f9-bb23-67dac303c015\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.862934 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w5zm\" (UniqueName: \"kubernetes.io/projected/d3433a2f-3af8-44f9-bb23-67dac303c015-kube-api-access-8w5zm\") pod \"d3433a2f-3af8-44f9-bb23-67dac303c015\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.862968 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-client-ca\") pod \"d3433a2f-3af8-44f9-bb23-67dac303c015\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.863986 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-client-ca" (OuterVolumeSpecName: "client-ca") pod "d3433a2f-3af8-44f9-bb23-67dac303c015" (UID: "d3433a2f-3af8-44f9-bb23-67dac303c015"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.864313 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d3433a2f-3af8-44f9-bb23-67dac303c015" (UID: "d3433a2f-3af8-44f9-bb23-67dac303c015"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.864764 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-config" (OuterVolumeSpecName: "config") pod "3db4bc6b-389f-4ccf-b70e-eb97114f85e6" (UID: "3db4bc6b-389f-4ccf-b70e-eb97114f85e6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.866494 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-client-ca" (OuterVolumeSpecName: "client-ca") pod "3db4bc6b-389f-4ccf-b70e-eb97114f85e6" (UID: "3db4bc6b-389f-4ccf-b70e-eb97114f85e6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.866595 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-config" (OuterVolumeSpecName: "config") pod "d3433a2f-3af8-44f9-bb23-67dac303c015" (UID: "d3433a2f-3af8-44f9-bb23-67dac303c015"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.870706 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3db4bc6b-389f-4ccf-b70e-eb97114f85e6" (UID: "3db4bc6b-389f-4ccf-b70e-eb97114f85e6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.874136 4994 scope.go:117] "RemoveContainer" containerID="d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6" Mar 10 00:14:15 crc kubenswrapper[4994]: E0310 00:14:15.875769 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6\": container with ID starting with d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6 not found: ID does not exist" containerID="d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.875981 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6"} err="failed to get container status \"d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6\": rpc error: code = NotFound desc = could not find container \"d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6\": container with ID starting with d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6 not found: ID does not exist" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.876027 4994 scope.go:117] "RemoveContainer" containerID="d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.877361 4994 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3433a2f-3af8-44f9-bb23-67dac303c015-kube-api-access-8w5zm" (OuterVolumeSpecName: "kube-api-access-8w5zm") pod "d3433a2f-3af8-44f9-bb23-67dac303c015" (UID: "d3433a2f-3af8-44f9-bb23-67dac303c015"). InnerVolumeSpecName "kube-api-access-8w5zm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.878384 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-kube-api-access-9vrzp" (OuterVolumeSpecName: "kube-api-access-9vrzp") pod "3db4bc6b-389f-4ccf-b70e-eb97114f85e6" (UID: "3db4bc6b-389f-4ccf-b70e-eb97114f85e6"). InnerVolumeSpecName "kube-api-access-9vrzp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.886166 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3433a2f-3af8-44f9-bb23-67dac303c015-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d3433a2f-3af8-44f9-bb23-67dac303c015" (UID: "d3433a2f-3af8-44f9-bb23-67dac303c015"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.895517 4994 scope.go:117] "RemoveContainer" containerID="d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175"
Mar 10 00:14:15 crc kubenswrapper[4994]: E0310 00:14:15.895994 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175\": container with ID starting with d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175 not found: ID does not exist" containerID="d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175"
Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.896032 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175"} err="failed to get container status \"d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175\": rpc error: code = NotFound desc = could not find container \"d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175\": container with ID starting with d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175 not found: ID does not exist"
Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.964772 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3433a2f-3af8-44f9-bb23-67dac303c015-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.964803 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-config\") on node \"crc\" DevicePath \"\""
Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.964814 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w5zm\" (UniqueName: \"kubernetes.io/projected/d3433a2f-3af8-44f9-bb23-67dac303c015-kube-api-access-8w5zm\") on node \"crc\" DevicePath \"\""
Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.964822 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.964829 4994 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.964838 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-config\") on node \"crc\" DevicePath \"\""
Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.964888 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.964897 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.964904 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vrzp\" (UniqueName: \"kubernetes.io/projected/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-kube-api-access-9vrzp\") on node \"crc\" DevicePath \"\""
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.000692 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fxpkq"]
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.176426 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8"]
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.180086 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8"]
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.190932 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56bcc8b679-5m2f4"]
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.195212 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-56bcc8b679-5m2f4"]
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.551044 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7df6847577-hghv8"]
Mar 10 00:14:16 crc kubenswrapper[4994]: E0310 00:14:16.551237 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3433a2f-3af8-44f9-bb23-67dac303c015" containerName="controller-manager"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.551248 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3433a2f-3af8-44f9-bb23-67dac303c015" containerName="controller-manager"
Mar 10 00:14:16 crc kubenswrapper[4994]: E0310 00:14:16.551262 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e91ae1c5-3f03-4439-b579-b828884a1b58" containerName="oc"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.551268 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="e91ae1c5-3f03-4439-b579-b828884a1b58" containerName="oc"
Mar 10 00:14:16 crc kubenswrapper[4994]: E0310 00:14:16.551284 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db4bc6b-389f-4ccf-b70e-eb97114f85e6" containerName="route-controller-manager"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.551290 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db4bc6b-389f-4ccf-b70e-eb97114f85e6" containerName="route-controller-manager"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.551368 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3433a2f-3af8-44f9-bb23-67dac303c015" containerName="controller-manager"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.551382 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db4bc6b-389f-4ccf-b70e-eb97114f85e6" containerName="route-controller-manager"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.551395 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="e91ae1c5-3f03-4439-b579-b828884a1b58" containerName="oc"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.551722 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7df6847577-hghv8"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.554042 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.554278 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.554280 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.554530 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.555339 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.557386 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.568694 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.572868 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3db4bc6b-389f-4ccf-b70e-eb97114f85e6" path="/var/lib/kubelet/pods/3db4bc6b-389f-4ccf-b70e-eb97114f85e6/volumes"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.573800 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9de70537-1ea7-4305-9674-34a0f6493916-config\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.573834 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9de70537-1ea7-4305-9674-34a0f6493916-serving-cert\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.573903 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9de70537-1ea7-4305-9674-34a0f6493916-proxy-ca-bundles\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.573935 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp8nj\" (UniqueName: \"kubernetes.io/projected/9de70537-1ea7-4305-9674-34a0f6493916-kube-api-access-zp8nj\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.573953 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9de70537-1ea7-4305-9674-34a0f6493916-client-ca\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.574132 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3433a2f-3af8-44f9-bb23-67dac303c015" path="/var/lib/kubelet/pods/d3433a2f-3af8-44f9-bb23-67dac303c015/volumes"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.574924 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"]
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.578337 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7df6847577-hghv8"]
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.578377 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"]
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.578465 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.582377 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.582692 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.582722 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.583299 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.583546 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.583660 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.677825 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9de70537-1ea7-4305-9674-34a0f6493916-config\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.677946 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9de70537-1ea7-4305-9674-34a0f6493916-serving-cert\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.678079 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-client-ca\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.678114 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9de70537-1ea7-4305-9674-34a0f6493916-proxy-ca-bundles\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.678171 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-config\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.678196 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9szqn\" (UniqueName: \"kubernetes.io/projected/515d1a06-2d73-4eb3-931c-add8a5c7940f-kube-api-access-9szqn\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.678239 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/515d1a06-2d73-4eb3-931c-add8a5c7940f-serving-cert\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.678261 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp8nj\" (UniqueName: \"kubernetes.io/projected/9de70537-1ea7-4305-9674-34a0f6493916-kube-api-access-zp8nj\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.678290 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9de70537-1ea7-4305-9674-34a0f6493916-client-ca\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.679281 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9de70537-1ea7-4305-9674-34a0f6493916-client-ca\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.681095 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9de70537-1ea7-4305-9674-34a0f6493916-config\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.683212 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9de70537-1ea7-4305-9674-34a0f6493916-proxy-ca-bundles\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.696359 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9de70537-1ea7-4305-9674-34a0f6493916-serving-cert\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.711142 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp8nj\" (UniqueName: \"kubernetes.io/projected/9de70537-1ea7-4305-9674-34a0f6493916-kube-api-access-zp8nj\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.778954 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-client-ca\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.779033 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-config\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.779064 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9szqn\" (UniqueName: \"kubernetes.io/projected/515d1a06-2d73-4eb3-931c-add8a5c7940f-kube-api-access-9szqn\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.779098 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/515d1a06-2d73-4eb3-931c-add8a5c7940f-serving-cert\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.779954 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-client-ca\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.781563 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-config\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.784478 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/515d1a06-2d73-4eb3-931c-add8a5c7940f-serving-cert\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.807484 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9szqn\" (UniqueName: \"kubernetes.io/projected/515d1a06-2d73-4eb3-931c-add8a5c7940f-kube-api-access-9szqn\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.882650 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7df6847577-hghv8"
Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.892795 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"
Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.071125 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"]
Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.110700 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7df6847577-hghv8"]
Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.867774 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" event={"ID":"9de70537-1ea7-4305-9674-34a0f6493916","Type":"ContainerStarted","Data":"fa4f6aad3e2bd55dec5562201d04116607e7ec340427792efd6db5ffce0f6fa5"}
Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.868012 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7df6847577-hghv8"
Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.868024 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" event={"ID":"9de70537-1ea7-4305-9674-34a0f6493916","Type":"ContainerStarted","Data":"c94e6296250c8576c6a3e6609eef102162e9c547d0081e5dae6fc85927973cb3"}
Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.868802 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" event={"ID":"515d1a06-2d73-4eb3-931c-add8a5c7940f","Type":"ContainerStarted","Data":"a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1"}
Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.868838 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" event={"ID":"515d1a06-2d73-4eb3-931c-add8a5c7940f","Type":"ContainerStarted","Data":"074e301a7c36bb2cd8c0f431623fca75cce6ab78360d5f6f9180331c04931148"}
Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.869079 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"
Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.873304 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"
Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.873683 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7df6847577-hghv8"
Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.905733 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" podStartSLOduration=2.90572002 podStartE2EDuration="2.90572002s" podCreationTimestamp="2026-03-10 00:14:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:14:17.887044286 +0000 UTC m=+472.060751035" watchObservedRunningTime="2026-03-10 00:14:17.90572002 +0000 UTC m=+472.079426759"
Mar 10 00:14:18 crc kubenswrapper[4994]: I0310 00:14:18.892947 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 00:14:18 crc kubenswrapper[4994]: I0310 00:14:18.893438 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.068384 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" podStartSLOduration=6.068356991 podStartE2EDuration="6.068356991s" podCreationTimestamp="2026-03-10 00:14:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:14:17.926386167 +0000 UTC m=+472.100092926" watchObservedRunningTime="2026-03-10 00:14:21.068356991 +0000 UTC m=+475.242063780"
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.069068 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"]
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.069372 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" podUID="515d1a06-2d73-4eb3-931c-add8a5c7940f" containerName="route-controller-manager" containerID="cri-o://a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1" gracePeriod=30
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.509047 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.636166 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9szqn\" (UniqueName: \"kubernetes.io/projected/515d1a06-2d73-4eb3-931c-add8a5c7940f-kube-api-access-9szqn\") pod \"515d1a06-2d73-4eb3-931c-add8a5c7940f\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") "
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.636284 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/515d1a06-2d73-4eb3-931c-add8a5c7940f-serving-cert\") pod \"515d1a06-2d73-4eb3-931c-add8a5c7940f\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") "
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.636326 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-config\") pod \"515d1a06-2d73-4eb3-931c-add8a5c7940f\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") "
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.636441 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-client-ca\") pod \"515d1a06-2d73-4eb3-931c-add8a5c7940f\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") "
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.637438 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-client-ca" (OuterVolumeSpecName: "client-ca") pod "515d1a06-2d73-4eb3-931c-add8a5c7940f" (UID: "515d1a06-2d73-4eb3-931c-add8a5c7940f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.638057 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-config" (OuterVolumeSpecName: "config") pod "515d1a06-2d73-4eb3-931c-add8a5c7940f" (UID: "515d1a06-2d73-4eb3-931c-add8a5c7940f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.642945 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/515d1a06-2d73-4eb3-931c-add8a5c7940f-kube-api-access-9szqn" (OuterVolumeSpecName: "kube-api-access-9szqn") pod "515d1a06-2d73-4eb3-931c-add8a5c7940f" (UID: "515d1a06-2d73-4eb3-931c-add8a5c7940f"). InnerVolumeSpecName "kube-api-access-9szqn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.647818 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/515d1a06-2d73-4eb3-931c-add8a5c7940f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "515d1a06-2d73-4eb3-931c-add8a5c7940f" (UID: "515d1a06-2d73-4eb3-931c-add8a5c7940f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.746653 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.746703 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9szqn\" (UniqueName: \"kubernetes.io/projected/515d1a06-2d73-4eb3-931c-add8a5c7940f-kube-api-access-9szqn\") on node \"crc\" DevicePath \"\""
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.746718 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/515d1a06-2d73-4eb3-931c-add8a5c7940f-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.746728 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-config\") on node \"crc\" DevicePath \"\""
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.896678 4994 generic.go:334] "Generic (PLEG): container finished" podID="515d1a06-2d73-4eb3-931c-add8a5c7940f" containerID="a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1" exitCode=0
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.896847 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.896915 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" event={"ID":"515d1a06-2d73-4eb3-931c-add8a5c7940f","Type":"ContainerDied","Data":"a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1"}
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.897782 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" event={"ID":"515d1a06-2d73-4eb3-931c-add8a5c7940f","Type":"ContainerDied","Data":"074e301a7c36bb2cd8c0f431623fca75cce6ab78360d5f6f9180331c04931148"}
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.897823 4994 scope.go:117] "RemoveContainer" containerID="a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1"
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.920412 4994 scope.go:117] "RemoveContainer" containerID="a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1"
Mar 10 00:14:21 crc kubenswrapper[4994]: E0310 00:14:21.921047 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1\": container with ID starting with a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1 not found: ID does not exist" containerID="a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1"
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.921108 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1"} err="failed to get container status \"a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1\": rpc error: code = NotFound desc = could not find container \"a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1\": container with ID starting with a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1 not found: ID does not exist"
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.953571 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"]
Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.961410 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"]
Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.560755 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="515d1a06-2d73-4eb3-931c-add8a5c7940f" path="/var/lib/kubelet/pods/515d1a06-2d73-4eb3-931c-add8a5c7940f/volumes"
Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.561171 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4"]
Mar 10 00:14:22 crc kubenswrapper[4994]: E0310 00:14:22.561348 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="515d1a06-2d73-4eb3-931c-add8a5c7940f" containerName="route-controller-manager"
Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.561359 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="515d1a06-2d73-4eb3-931c-add8a5c7940f" containerName="route-controller-manager"
Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.561453 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="515d1a06-2d73-4eb3-931c-add8a5c7940f" containerName="route-controller-manager"
Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.564413 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4"
Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.567579 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.567782 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.567833 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4"]
Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.567843 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.567965 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.568048 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.568446 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.758100 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7d43a41-1177-47b1-ac5f-3d4309491587-serving-cert\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4"
Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.758200 4994 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-client-ca\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.758258 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tdtw\" (UniqueName: \"kubernetes.io/projected/d7d43a41-1177-47b1-ac5f-3d4309491587-kube-api-access-4tdtw\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.758360 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-config\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.859259 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7d43a41-1177-47b1-ac5f-3d4309491587-serving-cert\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.859307 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-client-ca\") pod 
\"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.859332 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tdtw\" (UniqueName: \"kubernetes.io/projected/d7d43a41-1177-47b1-ac5f-3d4309491587-kube-api-access-4tdtw\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.859360 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-config\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.860501 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-config\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.860854 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-client-ca\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.869171 4994 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7d43a41-1177-47b1-ac5f-3d4309491587-serving-cert\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.878575 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tdtw\" (UniqueName: \"kubernetes.io/projected/d7d43a41-1177-47b1-ac5f-3d4309491587-kube-api-access-4tdtw\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.891420 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:23 crc kubenswrapper[4994]: I0310 00:14:23.329676 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4"] Mar 10 00:14:23 crc kubenswrapper[4994]: I0310 00:14:23.916699 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" event={"ID":"d7d43a41-1177-47b1-ac5f-3d4309491587","Type":"ContainerStarted","Data":"92d6fcd0f4f7fb4ba9168b84d9692bf78d32d2b1f295642d2ecc92e5c84e1694"} Mar 10 00:14:23 crc kubenswrapper[4994]: I0310 00:14:23.917124 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" event={"ID":"d7d43a41-1177-47b1-ac5f-3d4309491587","Type":"ContainerStarted","Data":"bf8f57540b2c9250bb5294616cdfe2a18d71874683f0e928b618b3a35928461f"} Mar 10 00:14:23 crc kubenswrapper[4994]: I0310 00:14:23.918950 4994 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:23 crc kubenswrapper[4994]: I0310 00:14:23.935156 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" podStartSLOduration=2.935130184 podStartE2EDuration="2.935130184s" podCreationTimestamp="2026-03-10 00:14:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:14:23.933285946 +0000 UTC m=+478.106992735" watchObservedRunningTime="2026-03-10 00:14:23.935130184 +0000 UTC m=+478.108836973" Mar 10 00:14:24 crc kubenswrapper[4994]: I0310 00:14:24.072234 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.034520 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" podUID="903778b5-0c60-42d6-8773-a1345817fe1f" containerName="oauth-openshift" containerID="cri-o://8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3" gracePeriod=15 Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.518497 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.582120 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7c96dbfd49-scp27"] Mar 10 00:14:41 crc kubenswrapper[4994]: E0310 00:14:41.582382 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="903778b5-0c60-42d6-8773-a1345817fe1f" containerName="oauth-openshift" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.582403 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="903778b5-0c60-42d6-8773-a1345817fe1f" containerName="oauth-openshift" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.582565 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="903778b5-0c60-42d6-8773-a1345817fe1f" containerName="oauth-openshift" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.583169 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.597931 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7c96dbfd49-scp27"] Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.706292 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/903778b5-0c60-42d6-8773-a1345817fe1f-audit-dir\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.706417 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/903778b5-0c60-42d6-8773-a1345817fe1f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.706699 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-ocp-branding-template\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.706753 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-service-ca\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.706839 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-provider-selection\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.706905 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-idp-0-file-data\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.706986 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48dxj\" (UniqueName: \"kubernetes.io/projected/903778b5-0c60-42d6-8773-a1345817fe1f-kube-api-access-48dxj\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: 
\"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707052 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-session\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707111 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-trusted-ca-bundle\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707165 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-error\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707212 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-cliconfig\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707254 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-audit-policies\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 
00:14:41.707304 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-router-certs\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707347 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-login\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707382 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-serving-cert\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707596 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-template-login\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707652 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdwtj\" (UniqueName: \"kubernetes.io/projected/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-kube-api-access-vdwtj\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " 
pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707714 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-audit-dir\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707749 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-service-ca\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707846 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-audit-policies\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707925 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707969 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-router-certs\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.708029 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.708065 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.708136 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.708188 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-template-error\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.708238 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.709014 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.709041 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.710227 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.710448 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.710584 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-session\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.710617 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.710846 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.710998 4994 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.711097 4994 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/903778b5-0c60-42d6-8773-a1345817fe1f-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.711123 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.714835 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/903778b5-0c60-42d6-8773-a1345817fe1f-kube-api-access-48dxj" (OuterVolumeSpecName: "kube-api-access-48dxj") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "kube-api-access-48dxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.716073 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.716143 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.719267 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.719688 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.721262 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.722013 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.722707 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.725375 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812313 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812366 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-template-error\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812396 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812416 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812445 4994 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-session\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812477 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-template-login\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812500 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdwtj\" (UniqueName: \"kubernetes.io/projected/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-kube-api-access-vdwtj\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812523 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-audit-dir\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812542 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-service-ca\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " 
pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812568 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-audit-policies\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812590 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812616 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-router-certs\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812643 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812666 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812718 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812734 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812747 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812759 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48dxj\" (UniqueName: \"kubernetes.io/projected/903778b5-0c60-42d6-8773-a1345817fe1f-kube-api-access-48dxj\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812771 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812785 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812798 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812812 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812824 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812837 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.813199 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-audit-dir\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.813943 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.815618 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-service-ca\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.815748 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-audit-policies\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.816049 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-template-error\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.816638 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc 
kubenswrapper[4994]: I0310 00:14:41.819007 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.819055 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.819772 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.820036 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-router-certs\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.820208 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.823304 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-session\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.823805 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-template-login\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.832284 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdwtj\" (UniqueName: \"kubernetes.io/projected/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-kube-api-access-vdwtj\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.919997 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.046575 4994 generic.go:334] "Generic (PLEG): container finished" podID="903778b5-0c60-42d6-8773-a1345817fe1f" containerID="8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3" exitCode=0 Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.047022 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" event={"ID":"903778b5-0c60-42d6-8773-a1345817fe1f","Type":"ContainerDied","Data":"8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3"} Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.047118 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" event={"ID":"903778b5-0c60-42d6-8773-a1345817fe1f","Type":"ContainerDied","Data":"579d8e47d1cae1e88f269db0e29bcd43ee29c56b451d1f988d01fa0b8de660ec"} Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.047680 4994 scope.go:117] "RemoveContainer" containerID="8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3" Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.049122 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.089183 4994 scope.go:117] "RemoveContainer" containerID="8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3" Mar 10 00:14:42 crc kubenswrapper[4994]: E0310 00:14:42.089629 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3\": container with ID starting with 8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3 not found: ID does not exist" containerID="8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3" Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.089654 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3"} err="failed to get container status \"8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3\": rpc error: code = NotFound desc = could not find container \"8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3\": container with ID starting with 8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3 not found: ID does not exist" Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.106031 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fxpkq"] Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.111662 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fxpkq"] Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.382732 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7c96dbfd49-scp27"] Mar 10 00:14:42 crc kubenswrapper[4994]: W0310 00:14:42.391204 4994 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf56949c3_edcf_43cf_bb6f_a3e49d3fcb8b.slice/crio-03b27254f67e03c905777128a2a600d6ed7c7516c3082c89985fab6e592b8012 WatchSource:0}: Error finding container 03b27254f67e03c905777128a2a600d6ed7c7516c3082c89985fab6e592b8012: Status 404 returned error can't find the container with id 03b27254f67e03c905777128a2a600d6ed7c7516c3082c89985fab6e592b8012 Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.563563 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="903778b5-0c60-42d6-8773-a1345817fe1f" path="/var/lib/kubelet/pods/903778b5-0c60-42d6-8773-a1345817fe1f/volumes" Mar 10 00:14:43 crc kubenswrapper[4994]: I0310 00:14:43.055736 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" event={"ID":"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b","Type":"ContainerStarted","Data":"e605f0ae01431c850277b1b39ca954ccb4e6f85a736f0d82e023f7d83dba93c3"} Mar 10 00:14:43 crc kubenswrapper[4994]: I0310 00:14:43.055814 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" event={"ID":"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b","Type":"ContainerStarted","Data":"03b27254f67e03c905777128a2a600d6ed7c7516c3082c89985fab6e592b8012"} Mar 10 00:14:43 crc kubenswrapper[4994]: I0310 00:14:43.056031 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:43 crc kubenswrapper[4994]: I0310 00:14:43.087432 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" podStartSLOduration=27.087397746 podStartE2EDuration="27.087397746s" podCreationTimestamp="2026-03-10 00:14:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 
00:14:43.085305522 +0000 UTC m=+497.259012281" watchObservedRunningTime="2026-03-10 00:14:43.087397746 +0000 UTC m=+497.261104525" Mar 10 00:14:43 crc kubenswrapper[4994]: I0310 00:14:43.398371 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:48 crc kubenswrapper[4994]: I0310 00:14:48.892944 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:14:48 crc kubenswrapper[4994]: I0310 00:14:48.893766 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:14:48 crc kubenswrapper[4994]: I0310 00:14:48.893849 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:14:48 crc kubenswrapper[4994]: I0310 00:14:48.894806 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e0758e06b050c72a5ba1ca15578add547da69884a82997478d49f051fd653d6f"} pod="openshift-machine-config-operator/machine-config-daemon-kfljj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 00:14:48 crc kubenswrapper[4994]: I0310 00:14:48.894953 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" 
containerName="machine-config-daemon" containerID="cri-o://e0758e06b050c72a5ba1ca15578add547da69884a82997478d49f051fd653d6f" gracePeriod=600 Mar 10 00:14:49 crc kubenswrapper[4994]: I0310 00:14:49.109226 4994 generic.go:334] "Generic (PLEG): container finished" podID="ced5d66d-39df-4267-b801-e1e60d517ace" containerID="e0758e06b050c72a5ba1ca15578add547da69884a82997478d49f051fd653d6f" exitCode=0 Mar 10 00:14:49 crc kubenswrapper[4994]: I0310 00:14:49.109299 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerDied","Data":"e0758e06b050c72a5ba1ca15578add547da69884a82997478d49f051fd653d6f"} Mar 10 00:14:49 crc kubenswrapper[4994]: I0310 00:14:49.109359 4994 scope.go:117] "RemoveContainer" containerID="345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6" Mar 10 00:14:50 crc kubenswrapper[4994]: I0310 00:14:50.119663 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"3a58b30808ba3fd3b4a9259ace5f595c7d1bd5910098d24eb4a7ede149499cfa"} Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.628843 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-r27vp"] Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.630213 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.652199 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-r27vp"] Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.798583 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11d933f7-4a58-4a81-8916-647ed943e26d-registry-tls\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.798638 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11d933f7-4a58-4a81-8916-647ed943e26d-registry-certificates\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.798660 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11d933f7-4a58-4a81-8916-647ed943e26d-trusted-ca\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.798680 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11d933f7-4a58-4a81-8916-647ed943e26d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.798711 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11d933f7-4a58-4a81-8916-647ed943e26d-bound-sa-token\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.798760 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11d933f7-4a58-4a81-8916-647ed943e26d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.798791 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhbs5\" (UniqueName: \"kubernetes.io/projected/11d933f7-4a58-4a81-8916-647ed943e26d-kube-api-access-zhbs5\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.798840 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.847648 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.900157 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11d933f7-4a58-4a81-8916-647ed943e26d-bound-sa-token\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.900214 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11d933f7-4a58-4a81-8916-647ed943e26d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.900251 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhbs5\" (UniqueName: \"kubernetes.io/projected/11d933f7-4a58-4a81-8916-647ed943e26d-kube-api-access-zhbs5\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.900333 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11d933f7-4a58-4a81-8916-647ed943e26d-registry-tls\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc 
kubenswrapper[4994]: I0310 00:14:52.900364 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11d933f7-4a58-4a81-8916-647ed943e26d-registry-certificates\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.900384 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11d933f7-4a58-4a81-8916-647ed943e26d-trusted-ca\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.900404 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11d933f7-4a58-4a81-8916-647ed943e26d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.901537 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11d933f7-4a58-4a81-8916-647ed943e26d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.905704 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11d933f7-4a58-4a81-8916-647ed943e26d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-r27vp\" (UID: 
\"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.906435 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11d933f7-4a58-4a81-8916-647ed943e26d-registry-tls\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.906761 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11d933f7-4a58-4a81-8916-647ed943e26d-trusted-ca\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.907063 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11d933f7-4a58-4a81-8916-647ed943e26d-registry-certificates\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.918961 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11d933f7-4a58-4a81-8916-647ed943e26d-bound-sa-token\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.927198 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhbs5\" (UniqueName: \"kubernetes.io/projected/11d933f7-4a58-4a81-8916-647ed943e26d-kube-api-access-zhbs5\") pod 
\"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.947847 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:53 crc kubenswrapper[4994]: I0310 00:14:53.414539 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-r27vp"] Mar 10 00:14:53 crc kubenswrapper[4994]: W0310 00:14:53.425491 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11d933f7_4a58_4a81_8916_647ed943e26d.slice/crio-eac6e2a1bb3b63843499e6a6ff9a1655f2d3b0701f46affb8418c3cf678c8be4 WatchSource:0}: Error finding container eac6e2a1bb3b63843499e6a6ff9a1655f2d3b0701f46affb8418c3cf678c8be4: Status 404 returned error can't find the container with id eac6e2a1bb3b63843499e6a6ff9a1655f2d3b0701f46affb8418c3cf678c8be4 Mar 10 00:14:54 crc kubenswrapper[4994]: I0310 00:14:54.146631 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" event={"ID":"11d933f7-4a58-4a81-8916-647ed943e26d","Type":"ContainerStarted","Data":"55950f1339f17704e52315e7ae0d927736339fd0ab32d668f10cffaa923d280d"} Mar 10 00:14:54 crc kubenswrapper[4994]: I0310 00:14:54.146680 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" event={"ID":"11d933f7-4a58-4a81-8916-647ed943e26d","Type":"ContainerStarted","Data":"eac6e2a1bb3b63843499e6a6ff9a1655f2d3b0701f46affb8418c3cf678c8be4"} Mar 10 00:14:54 crc kubenswrapper[4994]: I0310 00:14:54.146772 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.133554 
4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" podStartSLOduration=8.13352918 podStartE2EDuration="8.13352918s" podCreationTimestamp="2026-03-10 00:14:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:14:54.171484605 +0000 UTC m=+508.345191354" watchObservedRunningTime="2026-03-10 00:15:00.13352918 +0000 UTC m=+514.307235969" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.136636 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv"] Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.137690 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.140290 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.143976 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.157995 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv"] Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.299595 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec6570de-1422-4eaf-a42d-dc5f62f91eba-secret-volume\") pod \"collect-profiles-29551695-642fv\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: 
I0310 00:15:00.299661 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec6570de-1422-4eaf-a42d-dc5f62f91eba-config-volume\") pod \"collect-profiles-29551695-642fv\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.299809 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmrfk\" (UniqueName: \"kubernetes.io/projected/ec6570de-1422-4eaf-a42d-dc5f62f91eba-kube-api-access-lmrfk\") pod \"collect-profiles-29551695-642fv\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.401097 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec6570de-1422-4eaf-a42d-dc5f62f91eba-secret-volume\") pod \"collect-profiles-29551695-642fv\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.401244 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec6570de-1422-4eaf-a42d-dc5f62f91eba-config-volume\") pod \"collect-profiles-29551695-642fv\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.401348 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmrfk\" (UniqueName: \"kubernetes.io/projected/ec6570de-1422-4eaf-a42d-dc5f62f91eba-kube-api-access-lmrfk\") pod 
\"collect-profiles-29551695-642fv\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.402925 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec6570de-1422-4eaf-a42d-dc5f62f91eba-config-volume\") pod \"collect-profiles-29551695-642fv\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.415432 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec6570de-1422-4eaf-a42d-dc5f62f91eba-secret-volume\") pod \"collect-profiles-29551695-642fv\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.422515 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmrfk\" (UniqueName: \"kubernetes.io/projected/ec6570de-1422-4eaf-a42d-dc5f62f91eba-kube-api-access-lmrfk\") pod \"collect-profiles-29551695-642fv\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.501183 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.983471 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv"] Mar 10 00:15:01 crc kubenswrapper[4994]: I0310 00:15:01.201691 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" event={"ID":"ec6570de-1422-4eaf-a42d-dc5f62f91eba","Type":"ContainerStarted","Data":"3e3524bdde5e5142c068a121dd13533f73b2944f4efabb9885c4bafd02497ae3"} Mar 10 00:15:01 crc kubenswrapper[4994]: I0310 00:15:01.202071 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" event={"ID":"ec6570de-1422-4eaf-a42d-dc5f62f91eba","Type":"ContainerStarted","Data":"27075e8e90509a20d0eb37e286e210f6b710fd6b7a61129393ecec9f6ef0c11e"} Mar 10 00:15:02 crc kubenswrapper[4994]: I0310 00:15:02.210777 4994 generic.go:334] "Generic (PLEG): container finished" podID="ec6570de-1422-4eaf-a42d-dc5f62f91eba" containerID="3e3524bdde5e5142c068a121dd13533f73b2944f4efabb9885c4bafd02497ae3" exitCode=0 Mar 10 00:15:02 crc kubenswrapper[4994]: I0310 00:15:02.210834 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" event={"ID":"ec6570de-1422-4eaf-a42d-dc5f62f91eba","Type":"ContainerDied","Data":"3e3524bdde5e5142c068a121dd13533f73b2944f4efabb9885c4bafd02497ae3"} Mar 10 00:15:03 crc kubenswrapper[4994]: I0310 00:15:03.588966 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:03 crc kubenswrapper[4994]: I0310 00:15:03.754162 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec6570de-1422-4eaf-a42d-dc5f62f91eba-secret-volume\") pod \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " Mar 10 00:15:03 crc kubenswrapper[4994]: I0310 00:15:03.754341 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmrfk\" (UniqueName: \"kubernetes.io/projected/ec6570de-1422-4eaf-a42d-dc5f62f91eba-kube-api-access-lmrfk\") pod \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " Mar 10 00:15:03 crc kubenswrapper[4994]: I0310 00:15:03.754422 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec6570de-1422-4eaf-a42d-dc5f62f91eba-config-volume\") pod \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " Mar 10 00:15:03 crc kubenswrapper[4994]: I0310 00:15:03.756440 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec6570de-1422-4eaf-a42d-dc5f62f91eba-config-volume" (OuterVolumeSpecName: "config-volume") pod "ec6570de-1422-4eaf-a42d-dc5f62f91eba" (UID: "ec6570de-1422-4eaf-a42d-dc5f62f91eba"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:15:03 crc kubenswrapper[4994]: I0310 00:15:03.764185 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec6570de-1422-4eaf-a42d-dc5f62f91eba-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ec6570de-1422-4eaf-a42d-dc5f62f91eba" (UID: "ec6570de-1422-4eaf-a42d-dc5f62f91eba"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:15:03 crc kubenswrapper[4994]: I0310 00:15:03.766102 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec6570de-1422-4eaf-a42d-dc5f62f91eba-kube-api-access-lmrfk" (OuterVolumeSpecName: "kube-api-access-lmrfk") pod "ec6570de-1422-4eaf-a42d-dc5f62f91eba" (UID: "ec6570de-1422-4eaf-a42d-dc5f62f91eba"). InnerVolumeSpecName "kube-api-access-lmrfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:15:03 crc kubenswrapper[4994]: I0310 00:15:03.856543 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmrfk\" (UniqueName: \"kubernetes.io/projected/ec6570de-1422-4eaf-a42d-dc5f62f91eba-kube-api-access-lmrfk\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:03 crc kubenswrapper[4994]: I0310 00:15:03.856592 4994 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec6570de-1422-4eaf-a42d-dc5f62f91eba-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:03 crc kubenswrapper[4994]: I0310 00:15:03.856612 4994 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec6570de-1422-4eaf-a42d-dc5f62f91eba-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:04 crc kubenswrapper[4994]: I0310 00:15:04.223435 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" event={"ID":"ec6570de-1422-4eaf-a42d-dc5f62f91eba","Type":"ContainerDied","Data":"27075e8e90509a20d0eb37e286e210f6b710fd6b7a61129393ecec9f6ef0c11e"} Mar 10 00:15:04 crc kubenswrapper[4994]: I0310 00:15:04.223513 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27075e8e90509a20d0eb37e286e210f6b710fd6b7a61129393ecec9f6ef0c11e" Mar 10 00:15:04 crc kubenswrapper[4994]: I0310 00:15:04.223536 4994 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.142707 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9dnvg"] Mar 10 00:15:07 crc kubenswrapper[4994]: E0310 00:15:07.144057 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec6570de-1422-4eaf-a42d-dc5f62f91eba" containerName="collect-profiles" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.144121 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec6570de-1422-4eaf-a42d-dc5f62f91eba" containerName="collect-profiles" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.144482 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec6570de-1422-4eaf-a42d-dc5f62f91eba" containerName="collect-profiles" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.147046 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.150026 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.176727 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9dnvg"] Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.304584 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad4ae94a-f55f-4133-9b34-f95992f5454b-utilities\") pod \"redhat-operators-9dnvg\" (UID: \"ad4ae94a-f55f-4133-9b34-f95992f5454b\") " pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.305100 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2s8cb\" (UniqueName: \"kubernetes.io/projected/ad4ae94a-f55f-4133-9b34-f95992f5454b-kube-api-access-2s8cb\") pod \"redhat-operators-9dnvg\" (UID: \"ad4ae94a-f55f-4133-9b34-f95992f5454b\") " pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.305422 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad4ae94a-f55f-4133-9b34-f95992f5454b-catalog-content\") pod \"redhat-operators-9dnvg\" (UID: \"ad4ae94a-f55f-4133-9b34-f95992f5454b\") " pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.331688 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4vf56"] Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.333671 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.337115 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.348578 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4vf56"] Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.407227 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad4ae94a-f55f-4133-9b34-f95992f5454b-catalog-content\") pod \"redhat-operators-9dnvg\" (UID: \"ad4ae94a-f55f-4133-9b34-f95992f5454b\") " pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.407335 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ad4ae94a-f55f-4133-9b34-f95992f5454b-utilities\") pod \"redhat-operators-9dnvg\" (UID: \"ad4ae94a-f55f-4133-9b34-f95992f5454b\") " pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.407433 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s8cb\" (UniqueName: \"kubernetes.io/projected/ad4ae94a-f55f-4133-9b34-f95992f5454b-kube-api-access-2s8cb\") pod \"redhat-operators-9dnvg\" (UID: \"ad4ae94a-f55f-4133-9b34-f95992f5454b\") " pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.407933 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad4ae94a-f55f-4133-9b34-f95992f5454b-catalog-content\") pod \"redhat-operators-9dnvg\" (UID: \"ad4ae94a-f55f-4133-9b34-f95992f5454b\") " pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.408177 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad4ae94a-f55f-4133-9b34-f95992f5454b-utilities\") pod \"redhat-operators-9dnvg\" (UID: \"ad4ae94a-f55f-4133-9b34-f95992f5454b\") " pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.440227 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s8cb\" (UniqueName: \"kubernetes.io/projected/ad4ae94a-f55f-4133-9b34-f95992f5454b-kube-api-access-2s8cb\") pod \"redhat-operators-9dnvg\" (UID: \"ad4ae94a-f55f-4133-9b34-f95992f5454b\") " pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.508670 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flst2\" (UniqueName: 
\"kubernetes.io/projected/2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105-kube-api-access-flst2\") pod \"certified-operators-4vf56\" (UID: \"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105\") " pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.508774 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105-catalog-content\") pod \"certified-operators-4vf56\" (UID: \"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105\") " pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.509005 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105-utilities\") pod \"certified-operators-4vf56\" (UID: \"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105\") " pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.519519 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.610583 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flst2\" (UniqueName: \"kubernetes.io/projected/2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105-kube-api-access-flst2\") pod \"certified-operators-4vf56\" (UID: \"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105\") " pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.610660 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105-catalog-content\") pod \"certified-operators-4vf56\" (UID: \"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105\") " pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.610822 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105-utilities\") pod \"certified-operators-4vf56\" (UID: \"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105\") " pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.611793 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105-utilities\") pod \"certified-operators-4vf56\" (UID: \"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105\") " pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.611907 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105-catalog-content\") pod \"certified-operators-4vf56\" (UID: \"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105\") " 
pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.642446 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flst2\" (UniqueName: \"kubernetes.io/projected/2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105-kube-api-access-flst2\") pod \"certified-operators-4vf56\" (UID: \"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105\") " pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.659253 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:08 crc kubenswrapper[4994]: I0310 00:15:08.024971 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9dnvg"] Mar 10 00:15:08 crc kubenswrapper[4994]: W0310 00:15:08.032119 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad4ae94a_f55f_4133_9b34_f95992f5454b.slice/crio-3ae937bfe2b438c26596b4336eb3f80fb45022398ac81956b458fbde8a6c9a8f WatchSource:0}: Error finding container 3ae937bfe2b438c26596b4336eb3f80fb45022398ac81956b458fbde8a6c9a8f: Status 404 returned error can't find the container with id 3ae937bfe2b438c26596b4336eb3f80fb45022398ac81956b458fbde8a6c9a8f Mar 10 00:15:08 crc kubenswrapper[4994]: I0310 00:15:08.116590 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4vf56"] Mar 10 00:15:08 crc kubenswrapper[4994]: W0310 00:15:08.124676 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a4e3a5e_5559_4e0b_a9b5_f117c0dcf105.slice/crio-f2fad8890475296348f668388e57267764dee5e2656aabba90c5c677efb49295 WatchSource:0}: Error finding container f2fad8890475296348f668388e57267764dee5e2656aabba90c5c677efb49295: Status 404 returned error can't find the container 
with id f2fad8890475296348f668388e57267764dee5e2656aabba90c5c677efb49295 Mar 10 00:15:08 crc kubenswrapper[4994]: I0310 00:15:08.250507 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vf56" event={"ID":"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105","Type":"ContainerStarted","Data":"f2fad8890475296348f668388e57267764dee5e2656aabba90c5c677efb49295"} Mar 10 00:15:08 crc kubenswrapper[4994]: I0310 00:15:08.251945 4994 generic.go:334] "Generic (PLEG): container finished" podID="ad4ae94a-f55f-4133-9b34-f95992f5454b" containerID="2c1e9079a493a5f62735a8cbb8e8de927eea69492fd29b516317758d01c362df" exitCode=0 Mar 10 00:15:08 crc kubenswrapper[4994]: I0310 00:15:08.251982 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9dnvg" event={"ID":"ad4ae94a-f55f-4133-9b34-f95992f5454b","Type":"ContainerDied","Data":"2c1e9079a493a5f62735a8cbb8e8de927eea69492fd29b516317758d01c362df"} Mar 10 00:15:08 crc kubenswrapper[4994]: I0310 00:15:08.252003 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9dnvg" event={"ID":"ad4ae94a-f55f-4133-9b34-f95992f5454b","Type":"ContainerStarted","Data":"3ae937bfe2b438c26596b4336eb3f80fb45022398ac81956b458fbde8a6c9a8f"} Mar 10 00:15:08 crc kubenswrapper[4994]: I0310 00:15:08.253592 4994 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.257185 4994 generic.go:334] "Generic (PLEG): container finished" podID="2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105" containerID="f8b7a572ee09996595bda7b0a6810aa9c629ec20b78ffb9010fccfcdcbf17cf5" exitCode=0 Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.257447 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vf56" 
event={"ID":"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105","Type":"ContainerDied","Data":"f8b7a572ee09996595bda7b0a6810aa9c629ec20b78ffb9010fccfcdcbf17cf5"}
Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.737392 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-45nlb"]
Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.738928 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-45nlb"
Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.741601 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.748296 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-45nlb"]
Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.844708 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aaa4876-9545-4d43-b7a3-02d53c8ef8f5-catalog-content\") pod \"community-operators-45nlb\" (UID: \"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5\") " pod="openshift-marketplace/community-operators-45nlb"
Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.844745 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aaa4876-9545-4d43-b7a3-02d53c8ef8f5-utilities\") pod \"community-operators-45nlb\" (UID: \"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5\") " pod="openshift-marketplace/community-operators-45nlb"
Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.844811 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvmv9\" (UniqueName: \"kubernetes.io/projected/2aaa4876-9545-4d43-b7a3-02d53c8ef8f5-kube-api-access-hvmv9\") pod \"community-operators-45nlb\" (UID: \"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5\") " pod="openshift-marketplace/community-operators-45nlb"
Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.926104 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2b884"]
Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.927229 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2b884"
Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.929958 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.945637 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aaa4876-9545-4d43-b7a3-02d53c8ef8f5-catalog-content\") pod \"community-operators-45nlb\" (UID: \"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5\") " pod="openshift-marketplace/community-operators-45nlb"
Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.945746 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aaa4876-9545-4d43-b7a3-02d53c8ef8f5-utilities\") pod \"community-operators-45nlb\" (UID: \"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5\") " pod="openshift-marketplace/community-operators-45nlb"
Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.946006 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvmv9\" (UniqueName: \"kubernetes.io/projected/2aaa4876-9545-4d43-b7a3-02d53c8ef8f5-kube-api-access-hvmv9\") pod \"community-operators-45nlb\" (UID: \"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5\") " pod="openshift-marketplace/community-operators-45nlb"
Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.946278 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aaa4876-9545-4d43-b7a3-02d53c8ef8f5-catalog-content\") pod \"community-operators-45nlb\" (UID: \"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5\") " pod="openshift-marketplace/community-operators-45nlb"
Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.946669 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aaa4876-9545-4d43-b7a3-02d53c8ef8f5-utilities\") pod \"community-operators-45nlb\" (UID: \"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5\") " pod="openshift-marketplace/community-operators-45nlb"
Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.955972 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2b884"]
Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.974996 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvmv9\" (UniqueName: \"kubernetes.io/projected/2aaa4876-9545-4d43-b7a3-02d53c8ef8f5-kube-api-access-hvmv9\") pod \"community-operators-45nlb\" (UID: \"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5\") " pod="openshift-marketplace/community-operators-45nlb"
Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.047045 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqxvd\" (UniqueName: \"kubernetes.io/projected/33f42081-c92d-42cd-90b5-329a5ae6c2ad-kube-api-access-cqxvd\") pod \"redhat-marketplace-2b884\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " pod="openshift-marketplace/redhat-marketplace-2b884"
Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.047439 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-utilities\") pod \"redhat-marketplace-2b884\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " pod="openshift-marketplace/redhat-marketplace-2b884"
Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.047493 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-catalog-content\") pod \"redhat-marketplace-2b884\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " pod="openshift-marketplace/redhat-marketplace-2b884"
Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.059955 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-45nlb"
Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.149053 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-utilities\") pod \"redhat-marketplace-2b884\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " pod="openshift-marketplace/redhat-marketplace-2b884"
Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.149137 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-catalog-content\") pod \"redhat-marketplace-2b884\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " pod="openshift-marketplace/redhat-marketplace-2b884"
Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.149219 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqxvd\" (UniqueName: \"kubernetes.io/projected/33f42081-c92d-42cd-90b5-329a5ae6c2ad-kube-api-access-cqxvd\") pod \"redhat-marketplace-2b884\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " pod="openshift-marketplace/redhat-marketplace-2b884"
Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.150280 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-utilities\") pod \"redhat-marketplace-2b884\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " pod="openshift-marketplace/redhat-marketplace-2b884"
Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.150653 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-catalog-content\") pod \"redhat-marketplace-2b884\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " pod="openshift-marketplace/redhat-marketplace-2b884"
Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.176427 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqxvd\" (UniqueName: \"kubernetes.io/projected/33f42081-c92d-42cd-90b5-329a5ae6c2ad-kube-api-access-cqxvd\") pod \"redhat-marketplace-2b884\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " pod="openshift-marketplace/redhat-marketplace-2b884"
Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.241108 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2b884"
Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.266866 4994 generic.go:334] "Generic (PLEG): container finished" podID="ad4ae94a-f55f-4133-9b34-f95992f5454b" containerID="2f057d8119fc87624569ba8ca8ad6e525c8575e2620f5f49190927e9fe1fcdbc" exitCode=0
Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.266927 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9dnvg" event={"ID":"ad4ae94a-f55f-4133-9b34-f95992f5454b","Type":"ContainerDied","Data":"2f057d8119fc87624569ba8ca8ad6e525c8575e2620f5f49190927e9fe1fcdbc"}
Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.318717 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-45nlb"]
Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.715101 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2b884"]
Mar 10 00:15:10 crc kubenswrapper[4994]: W0310 00:15:10.724799 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33f42081_c92d_42cd_90b5_329a5ae6c2ad.slice/crio-2092117d1a588bddb0afa93402f75947e71f54b402d21d698eb1a68abf48a5d8 WatchSource:0}: Error finding container 2092117d1a588bddb0afa93402f75947e71f54b402d21d698eb1a68abf48a5d8: Status 404 returned error can't find the container with id 2092117d1a588bddb0afa93402f75947e71f54b402d21d698eb1a68abf48a5d8
Mar 10 00:15:11 crc kubenswrapper[4994]: I0310 00:15:11.276850 4994 generic.go:334] "Generic (PLEG): container finished" podID="2aaa4876-9545-4d43-b7a3-02d53c8ef8f5" containerID="7ab894a2389f7d1bd80a6804cda0404c67ca01fc2a83e61e70b76f10cd2491ba" exitCode=0
Mar 10 00:15:11 crc kubenswrapper[4994]: I0310 00:15:11.276990 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45nlb" event={"ID":"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5","Type":"ContainerDied","Data":"7ab894a2389f7d1bd80a6804cda0404c67ca01fc2a83e61e70b76f10cd2491ba"}
Mar 10 00:15:11 crc kubenswrapper[4994]: I0310 00:15:11.277029 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45nlb" event={"ID":"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5","Type":"ContainerStarted","Data":"b7250913fcf3d29c66ad62790938642d67f029fbfbe79948b71977cbe1ef12f3"}
Mar 10 00:15:11 crc kubenswrapper[4994]: I0310 00:15:11.280444 4994 generic.go:334] "Generic (PLEG): container finished" podID="2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105" containerID="78cce51b9958b4131f8c656983a4b40ced8f6b40d33f51d1467669721bcb10ea" exitCode=0
Mar 10 00:15:11 crc kubenswrapper[4994]: I0310 00:15:11.280623 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vf56" event={"ID":"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105","Type":"ContainerDied","Data":"78cce51b9958b4131f8c656983a4b40ced8f6b40d33f51d1467669721bcb10ea"}
Mar 10 00:15:11 crc kubenswrapper[4994]: I0310 00:15:11.284370 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9dnvg" event={"ID":"ad4ae94a-f55f-4133-9b34-f95992f5454b","Type":"ContainerStarted","Data":"0754e3199dca8df9639ade2c3f4f9de8b339d050a381cf423dc4366a663f8b81"}
Mar 10 00:15:11 crc kubenswrapper[4994]: I0310 00:15:11.286150 4994 generic.go:334] "Generic (PLEG): container finished" podID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerID="9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b" exitCode=0
Mar 10 00:15:11 crc kubenswrapper[4994]: I0310 00:15:11.286183 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2b884" event={"ID":"33f42081-c92d-42cd-90b5-329a5ae6c2ad","Type":"ContainerDied","Data":"9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b"}
Mar 10 00:15:11 crc kubenswrapper[4994]: I0310 00:15:11.286208 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2b884" event={"ID":"33f42081-c92d-42cd-90b5-329a5ae6c2ad","Type":"ContainerStarted","Data":"2092117d1a588bddb0afa93402f75947e71f54b402d21d698eb1a68abf48a5d8"}
Mar 10 00:15:11 crc kubenswrapper[4994]: I0310 00:15:11.372891 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9dnvg" podStartSLOduration=1.830604906 podStartE2EDuration="4.372814849s" podCreationTimestamp="2026-03-10 00:15:07 +0000 UTC" firstStartedPulling="2026-03-10 00:15:08.253321745 +0000 UTC m=+522.427028504" lastFinishedPulling="2026-03-10 00:15:10.795531668 +0000 UTC m=+524.969238447" observedRunningTime="2026-03-10 00:15:11.367711655 +0000 UTC m=+525.541418444" watchObservedRunningTime="2026-03-10 00:15:11.372814849 +0000 UTC m=+525.546521638"
Mar 10 00:15:12 crc kubenswrapper[4994]: I0310 00:15:12.300150 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45nlb" event={"ID":"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5","Type":"ContainerStarted","Data":"388b1069a6c755e8c923e733a205c8312af5821f43f85d06f4a0ca0e4bdd6f96"}
Mar 10 00:15:12 crc kubenswrapper[4994]: I0310 00:15:12.305986 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vf56" event={"ID":"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105","Type":"ContainerStarted","Data":"9dd79697a8e5687f54bb0cc307dbf3e35bf4e0e64c6be831299048032444a299"}
Mar 10 00:15:12 crc kubenswrapper[4994]: I0310 00:15:12.308978 4994 generic.go:334] "Generic (PLEG): container finished" podID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerID="d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a" exitCode=0
Mar 10 00:15:12 crc kubenswrapper[4994]: I0310 00:15:12.309841 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2b884" event={"ID":"33f42081-c92d-42cd-90b5-329a5ae6c2ad","Type":"ContainerDied","Data":"d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a"}
Mar 10 00:15:12 crc kubenswrapper[4994]: I0310 00:15:12.343803 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4vf56" podStartSLOduration=2.677153419 podStartE2EDuration="5.343783503s" podCreationTimestamp="2026-03-10 00:15:07 +0000 UTC" firstStartedPulling="2026-03-10 00:15:09.258431387 +0000 UTC m=+523.432138136" lastFinishedPulling="2026-03-10 00:15:11.925061441 +0000 UTC m=+526.098768220" observedRunningTime="2026-03-10 00:15:12.342003016 +0000 UTC m=+526.515709775" watchObservedRunningTime="2026-03-10 00:15:12.343783503 +0000 UTC m=+526.517490272"
Mar 10 00:15:12 crc kubenswrapper[4994]: I0310 00:15:12.956776 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-r27vp"
Mar 10 00:15:13 crc kubenswrapper[4994]: I0310 00:15:13.028118 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-75h8c"]
Mar 10 00:15:13 crc kubenswrapper[4994]: I0310 00:15:13.316401 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2b884" event={"ID":"33f42081-c92d-42cd-90b5-329a5ae6c2ad","Type":"ContainerStarted","Data":"025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138"}
Mar 10 00:15:13 crc kubenswrapper[4994]: I0310 00:15:13.317778 4994 generic.go:334] "Generic (PLEG): container finished" podID="2aaa4876-9545-4d43-b7a3-02d53c8ef8f5" containerID="388b1069a6c755e8c923e733a205c8312af5821f43f85d06f4a0ca0e4bdd6f96" exitCode=0
Mar 10 00:15:13 crc kubenswrapper[4994]: I0310 00:15:13.318457 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45nlb" event={"ID":"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5","Type":"ContainerDied","Data":"388b1069a6c755e8c923e733a205c8312af5821f43f85d06f4a0ca0e4bdd6f96"}
Mar 10 00:15:13 crc kubenswrapper[4994]: I0310 00:15:13.337695 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2b884" podStartSLOduration=2.920161923 podStartE2EDuration="4.33767977s" podCreationTimestamp="2026-03-10 00:15:09 +0000 UTC" firstStartedPulling="2026-03-10 00:15:11.287811244 +0000 UTC m=+525.461518033" lastFinishedPulling="2026-03-10 00:15:12.705329131 +0000 UTC m=+526.879035880" observedRunningTime="2026-03-10 00:15:13.33656292 +0000 UTC m=+527.510269679" watchObservedRunningTime="2026-03-10 00:15:13.33767977 +0000 UTC m=+527.511386519"
Mar 10 00:15:14 crc kubenswrapper[4994]: I0310 00:15:14.326422 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45nlb" event={"ID":"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5","Type":"ContainerStarted","Data":"a90c33c3ace798b84ccc4b92b44a44778e9d8f90bb203eb635ac96337b56d324"}
Mar 10 00:15:14 crc kubenswrapper[4994]: I0310 00:15:14.349959 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-45nlb" podStartSLOduration=2.741832433 podStartE2EDuration="5.349933349s" podCreationTimestamp="2026-03-10 00:15:09 +0000 UTC" firstStartedPulling="2026-03-10 00:15:11.27969793 +0000 UTC m=+525.453404689" lastFinishedPulling="2026-03-10 00:15:13.887798816 +0000 UTC m=+528.061505605" observedRunningTime="2026-03-10 00:15:14.344719872 +0000 UTC m=+528.518426631" watchObservedRunningTime="2026-03-10 00:15:14.349933349 +0000 UTC m=+528.523640108"
Mar 10 00:15:17 crc kubenswrapper[4994]: I0310 00:15:17.520511 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9dnvg"
Mar 10 00:15:17 crc kubenswrapper[4994]: I0310 00:15:17.520947 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9dnvg"
Mar 10 00:15:17 crc kubenswrapper[4994]: I0310 00:15:17.661148 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4vf56"
Mar 10 00:15:17 crc kubenswrapper[4994]: I0310 00:15:17.661214 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4vf56"
Mar 10 00:15:17 crc kubenswrapper[4994]: I0310 00:15:17.708229 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4vf56"
Mar 10 00:15:18 crc kubenswrapper[4994]: I0310 00:15:18.416777 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4vf56"
Mar 10 00:15:18 crc kubenswrapper[4994]: I0310 00:15:18.599407 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9dnvg" podUID="ad4ae94a-f55f-4133-9b34-f95992f5454b" containerName="registry-server" probeResult="failure" output=<
Mar 10 00:15:18 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s
Mar 10 00:15:18 crc kubenswrapper[4994]: >
Mar 10 00:15:20 crc kubenswrapper[4994]: I0310 00:15:20.062217 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-45nlb"
Mar 10 00:15:20 crc kubenswrapper[4994]: I0310 00:15:20.062302 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-45nlb"
Mar 10 00:15:20 crc kubenswrapper[4994]: I0310 00:15:20.107464 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-45nlb"
Mar 10 00:15:20 crc kubenswrapper[4994]: I0310 00:15:20.241692 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2b884"
Mar 10 00:15:20 crc kubenswrapper[4994]: I0310 00:15:20.241739 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2b884"
Mar 10 00:15:20 crc kubenswrapper[4994]: I0310 00:15:20.315811 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2b884"
Mar 10 00:15:20 crc kubenswrapper[4994]: I0310 00:15:20.445730 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-45nlb"
Mar 10 00:15:20 crc kubenswrapper[4994]: I0310 00:15:20.446452 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2b884"
Mar 10 00:15:21 crc kubenswrapper[4994]: I0310 00:15:21.066707 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4"]
Mar 10 00:15:21 crc kubenswrapper[4994]: I0310 00:15:21.067021 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" podUID="d7d43a41-1177-47b1-ac5f-3d4309491587" containerName="route-controller-manager" containerID="cri-o://92d6fcd0f4f7fb4ba9168b84d9692bf78d32d2b1f295642d2ecc92e5c84e1694" gracePeriod=30
Mar 10 00:15:22 crc kubenswrapper[4994]: I0310 00:15:22.394922 4994 generic.go:334] "Generic (PLEG): container finished" podID="d7d43a41-1177-47b1-ac5f-3d4309491587" containerID="92d6fcd0f4f7fb4ba9168b84d9692bf78d32d2b1f295642d2ecc92e5c84e1694" exitCode=0
Mar 10 00:15:22 crc kubenswrapper[4994]: I0310 00:15:22.395194 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" event={"ID":"d7d43a41-1177-47b1-ac5f-3d4309491587","Type":"ContainerDied","Data":"92d6fcd0f4f7fb4ba9168b84d9692bf78d32d2b1f295642d2ecc92e5c84e1694"}
Mar 10 00:15:22 crc kubenswrapper[4994]: I0310 00:15:22.971900 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4"
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.014637 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"]
Mar 10 00:15:23 crc kubenswrapper[4994]: E0310 00:15:23.015186 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d43a41-1177-47b1-ac5f-3d4309491587" containerName="route-controller-manager"
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.015208 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d43a41-1177-47b1-ac5f-3d4309491587" containerName="route-controller-manager"
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.015331 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d43a41-1177-47b1-ac5f-3d4309491587" containerName="route-controller-manager"
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.015774 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.025295 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"]
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.053801 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-client-ca\") pod \"d7d43a41-1177-47b1-ac5f-3d4309491587\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") "
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.053866 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-config\") pod \"d7d43a41-1177-47b1-ac5f-3d4309491587\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") "
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.054026 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7d43a41-1177-47b1-ac5f-3d4309491587-serving-cert\") pod \"d7d43a41-1177-47b1-ac5f-3d4309491587\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") "
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.054075 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tdtw\" (UniqueName: \"kubernetes.io/projected/d7d43a41-1177-47b1-ac5f-3d4309491587-kube-api-access-4tdtw\") pod \"d7d43a41-1177-47b1-ac5f-3d4309491587\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") "
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.054922 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-config" (OuterVolumeSpecName: "config") pod "d7d43a41-1177-47b1-ac5f-3d4309491587" (UID: "d7d43a41-1177-47b1-ac5f-3d4309491587"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.055574 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-client-ca" (OuterVolumeSpecName: "client-ca") pod "d7d43a41-1177-47b1-ac5f-3d4309491587" (UID: "d7d43a41-1177-47b1-ac5f-3d4309491587"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.064117 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7d43a41-1177-47b1-ac5f-3d4309491587-kube-api-access-4tdtw" (OuterVolumeSpecName: "kube-api-access-4tdtw") pod "d7d43a41-1177-47b1-ac5f-3d4309491587" (UID: "d7d43a41-1177-47b1-ac5f-3d4309491587"). InnerVolumeSpecName "kube-api-access-4tdtw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.064135 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d43a41-1177-47b1-ac5f-3d4309491587-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d7d43a41-1177-47b1-ac5f-3d4309491587" (UID: "d7d43a41-1177-47b1-ac5f-3d4309491587"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.156088 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-config\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.156201 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxczt\" (UniqueName: \"kubernetes.io/projected/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-kube-api-access-qxczt\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.156258 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-client-ca\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.156387 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-serving-cert\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.156570 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.156619 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-config\") on node \"crc\" DevicePath \"\""
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.156652 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7d43a41-1177-47b1-ac5f-3d4309491587-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.156676 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tdtw\" (UniqueName: \"kubernetes.io/projected/d7d43a41-1177-47b1-ac5f-3d4309491587-kube-api-access-4tdtw\") on node \"crc\" DevicePath \"\""
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.258532 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-config\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.258601 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxczt\" (UniqueName: \"kubernetes.io/projected/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-kube-api-access-qxczt\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.258639 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-client-ca\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.258676 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-serving-cert\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.260012 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-client-ca\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.260669 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-config\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.265034 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-serving-cert\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.289268 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxczt\" (UniqueName: \"kubernetes.io/projected/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-kube-api-access-qxczt\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.349025 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.406225 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" event={"ID":"d7d43a41-1177-47b1-ac5f-3d4309491587","Type":"ContainerDied","Data":"bf8f57540b2c9250bb5294616cdfe2a18d71874683f0e928b618b3a35928461f"}
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.406307 4994 scope.go:117] "RemoveContainer" containerID="92d6fcd0f4f7fb4ba9168b84d9692bf78d32d2b1f295642d2ecc92e5c84e1694"
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.406325 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4"
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.456033 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4"]
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.460349 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4"]
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.651970 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"]
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.892606 4994 patch_prober.go:28] interesting pod/route-controller-manager-84db66d99d-vvln4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.892709 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" podUID="d7d43a41-1177-47b1-ac5f-3d4309491587" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 10 00:15:24 crc kubenswrapper[4994]: I0310 00:15:24.415962 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" event={"ID":"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc","Type":"ContainerStarted","Data":"ef076e6b86a043fc0a26be4822c4501f0022e6ae7d4e4ffb20ac283bed8aba1f"}
Mar 10 00:15:24 crc kubenswrapper[4994]: I0310 00:15:24.416032 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" event={"ID":"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc","Type":"ContainerStarted","Data":"812abc6e08175598e7ba80f628f3ccd1ad9da1302e824873176e499249c2fd09"}
Mar 10 00:15:24 crc kubenswrapper[4994]: I0310 00:15:24.416259 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"
Mar 10 00:15:24 crc kubenswrapper[4994]: I0310 00:15:24.448750 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" podStartSLOduration=3.44872114 podStartE2EDuration="3.44872114s" podCreationTimestamp="2026-03-10 00:15:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:15:24.443244036 +0000 UTC m=+538.616950825" watchObservedRunningTime="2026-03-10 00:15:24.44872114 +0000 UTC m=+538.622427919"
Mar 10 00:15:24 crc kubenswrapper[4994]: I0310 00:15:24.497124 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"
Mar 10 00:15:24 crc kubenswrapper[4994]: I0310 00:15:24.585190 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7d43a41-1177-47b1-ac5f-3d4309491587" path="/var/lib/kubelet/pods/d7d43a41-1177-47b1-ac5f-3d4309491587/volumes"
Mar 10 00:15:27 crc kubenswrapper[4994]: I0310 00:15:27.590045 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9dnvg"
Mar 10 00:15:27 crc kubenswrapper[4994]: I0310 00:15:27.698815 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9dnvg"
Mar 10
00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.077144 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" podUID="295cba62-fd24-4245-8773-866ee134a29e" containerName="registry" containerID="cri-o://0b99026327a0246e8d6a6998d063d7da1dc8dded77c229f16aea5a63dc4137ba" gracePeriod=30 Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.514230 4994 generic.go:334] "Generic (PLEG): container finished" podID="295cba62-fd24-4245-8773-866ee134a29e" containerID="0b99026327a0246e8d6a6998d063d7da1dc8dded77c229f16aea5a63dc4137ba" exitCode=0 Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.514382 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" event={"ID":"295cba62-fd24-4245-8773-866ee134a29e","Type":"ContainerDied","Data":"0b99026327a0246e8d6a6998d063d7da1dc8dded77c229f16aea5a63dc4137ba"} Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.514723 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" event={"ID":"295cba62-fd24-4245-8773-866ee134a29e","Type":"ContainerDied","Data":"dd7db90166bf3d060ca8294e206e48fea14498e48c7b2ef7fe0c1e7d9d4dd09f"} Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.514746 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd7db90166bf3d060ca8294e206e48fea14498e48c7b2ef7fe0c1e7d9d4dd09f" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.542689 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.657644 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-bound-sa-token\") pod \"295cba62-fd24-4245-8773-866ee134a29e\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.657699 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/295cba62-fd24-4245-8773-866ee134a29e-installation-pull-secrets\") pod \"295cba62-fd24-4245-8773-866ee134a29e\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.657736 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-registry-tls\") pod \"295cba62-fd24-4245-8773-866ee134a29e\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.657770 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbrkc\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-kube-api-access-kbrkc\") pod \"295cba62-fd24-4245-8773-866ee134a29e\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.657824 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/295cba62-fd24-4245-8773-866ee134a29e-ca-trust-extracted\") pod \"295cba62-fd24-4245-8773-866ee134a29e\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.658798 4994 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-trusted-ca\") pod \"295cba62-fd24-4245-8773-866ee134a29e\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.658829 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-registry-certificates\") pod \"295cba62-fd24-4245-8773-866ee134a29e\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.658969 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"295cba62-fd24-4245-8773-866ee134a29e\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.659304 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "295cba62-fd24-4245-8773-866ee134a29e" (UID: "295cba62-fd24-4245-8773-866ee134a29e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.659393 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "295cba62-fd24-4245-8773-866ee134a29e" (UID: "295cba62-fd24-4245-8773-866ee134a29e"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.661597 4994 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.661618 4994 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.664136 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "295cba62-fd24-4245-8773-866ee134a29e" (UID: "295cba62-fd24-4245-8773-866ee134a29e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.664842 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "295cba62-fd24-4245-8773-866ee134a29e" (UID: "295cba62-fd24-4245-8773-866ee134a29e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.668035 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295cba62-fd24-4245-8773-866ee134a29e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "295cba62-fd24-4245-8773-866ee134a29e" (UID: "295cba62-fd24-4245-8773-866ee134a29e"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.668383 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-kube-api-access-kbrkc" (OuterVolumeSpecName: "kube-api-access-kbrkc") pod "295cba62-fd24-4245-8773-866ee134a29e" (UID: "295cba62-fd24-4245-8773-866ee134a29e"). InnerVolumeSpecName "kube-api-access-kbrkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.672430 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "295cba62-fd24-4245-8773-866ee134a29e" (UID: "295cba62-fd24-4245-8773-866ee134a29e"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.674211 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/295cba62-fd24-4245-8773-866ee134a29e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "295cba62-fd24-4245-8773-866ee134a29e" (UID: "295cba62-fd24-4245-8773-866ee134a29e"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.762968 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbrkc\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-kube-api-access-kbrkc\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.763025 4994 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/295cba62-fd24-4245-8773-866ee134a29e-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.763039 4994 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.763051 4994 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/295cba62-fd24-4245-8773-866ee134a29e-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.763065 4994 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:39 crc kubenswrapper[4994]: I0310 00:15:39.521493 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:15:39 crc kubenswrapper[4994]: I0310 00:15:39.577066 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-75h8c"] Mar 10 00:15:39 crc kubenswrapper[4994]: I0310 00:15:39.584190 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-75h8c"] Mar 10 00:15:40 crc kubenswrapper[4994]: I0310 00:15:40.565503 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="295cba62-fd24-4245-8773-866ee134a29e" path="/var/lib/kubelet/pods/295cba62-fd24-4245-8773-866ee134a29e/volumes" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.149775 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551696-w9ztg"] Mar 10 00:16:00 crc kubenswrapper[4994]: E0310 00:16:00.150791 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="295cba62-fd24-4245-8773-866ee134a29e" containerName="registry" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.150816 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="295cba62-fd24-4245-8773-866ee134a29e" containerName="registry" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.151029 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="295cba62-fd24-4245-8773-866ee134a29e" containerName="registry" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.151630 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551696-w9ztg" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.154613 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.155421 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.156191 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.164671 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551696-w9ztg"] Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.242544 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtb5b\" (UniqueName: \"kubernetes.io/projected/f25bd204-3572-4880-b74f-764a5a3e0123-kube-api-access-mtb5b\") pod \"auto-csr-approver-29551696-w9ztg\" (UID: \"f25bd204-3572-4880-b74f-764a5a3e0123\") " pod="openshift-infra/auto-csr-approver-29551696-w9ztg" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.344295 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtb5b\" (UniqueName: \"kubernetes.io/projected/f25bd204-3572-4880-b74f-764a5a3e0123-kube-api-access-mtb5b\") pod \"auto-csr-approver-29551696-w9ztg\" (UID: \"f25bd204-3572-4880-b74f-764a5a3e0123\") " pod="openshift-infra/auto-csr-approver-29551696-w9ztg" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.379403 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtb5b\" (UniqueName: \"kubernetes.io/projected/f25bd204-3572-4880-b74f-764a5a3e0123-kube-api-access-mtb5b\") pod \"auto-csr-approver-29551696-w9ztg\" (UID: \"f25bd204-3572-4880-b74f-764a5a3e0123\") " 
pod="openshift-infra/auto-csr-approver-29551696-w9ztg" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.482171 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551696-w9ztg" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.973814 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551696-w9ztg"] Mar 10 00:16:01 crc kubenswrapper[4994]: I0310 00:16:01.676259 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551696-w9ztg" event={"ID":"f25bd204-3572-4880-b74f-764a5a3e0123","Type":"ContainerStarted","Data":"cc4874c70c7b37039156aae07da17b94f31ae1faae8e17c8685d374d2debf7f4"} Mar 10 00:16:02 crc kubenswrapper[4994]: I0310 00:16:02.685053 4994 generic.go:334] "Generic (PLEG): container finished" podID="f25bd204-3572-4880-b74f-764a5a3e0123" containerID="60b85c7f8cd24fb6dc7f7bf060fdb2ddff2c7fdefc6188b8ccedeba460b2b511" exitCode=0 Mar 10 00:16:02 crc kubenswrapper[4994]: I0310 00:16:02.685147 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551696-w9ztg" event={"ID":"f25bd204-3572-4880-b74f-764a5a3e0123","Type":"ContainerDied","Data":"60b85c7f8cd24fb6dc7f7bf060fdb2ddff2c7fdefc6188b8ccedeba460b2b511"} Mar 10 00:16:04 crc kubenswrapper[4994]: I0310 00:16:04.098669 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551696-w9ztg" Mar 10 00:16:04 crc kubenswrapper[4994]: I0310 00:16:04.197024 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtb5b\" (UniqueName: \"kubernetes.io/projected/f25bd204-3572-4880-b74f-764a5a3e0123-kube-api-access-mtb5b\") pod \"f25bd204-3572-4880-b74f-764a5a3e0123\" (UID: \"f25bd204-3572-4880-b74f-764a5a3e0123\") " Mar 10 00:16:04 crc kubenswrapper[4994]: I0310 00:16:04.204043 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f25bd204-3572-4880-b74f-764a5a3e0123-kube-api-access-mtb5b" (OuterVolumeSpecName: "kube-api-access-mtb5b") pod "f25bd204-3572-4880-b74f-764a5a3e0123" (UID: "f25bd204-3572-4880-b74f-764a5a3e0123"). InnerVolumeSpecName "kube-api-access-mtb5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:16:04 crc kubenswrapper[4994]: I0310 00:16:04.298663 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtb5b\" (UniqueName: \"kubernetes.io/projected/f25bd204-3572-4880-b74f-764a5a3e0123-kube-api-access-mtb5b\") on node \"crc\" DevicePath \"\"" Mar 10 00:16:04 crc kubenswrapper[4994]: I0310 00:16:04.702809 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551696-w9ztg" event={"ID":"f25bd204-3572-4880-b74f-764a5a3e0123","Type":"ContainerDied","Data":"cc4874c70c7b37039156aae07da17b94f31ae1faae8e17c8685d374d2debf7f4"} Mar 10 00:16:04 crc kubenswrapper[4994]: I0310 00:16:04.703169 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc4874c70c7b37039156aae07da17b94f31ae1faae8e17c8685d374d2debf7f4" Mar 10 00:16:04 crc kubenswrapper[4994]: I0310 00:16:04.702912 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551696-w9ztg" Mar 10 00:16:05 crc kubenswrapper[4994]: I0310 00:16:05.167255 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551690-7rbl8"] Mar 10 00:16:05 crc kubenswrapper[4994]: I0310 00:16:05.171913 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551690-7rbl8"] Mar 10 00:16:06 crc kubenswrapper[4994]: I0310 00:16:06.562588 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f04aae5d-b067-4e49-82f3-66412ec1bba6" path="/var/lib/kubelet/pods/f04aae5d-b067-4e49-82f3-66412ec1bba6/volumes" Mar 10 00:16:11 crc kubenswrapper[4994]: I0310 00:16:11.534391 4994 scope.go:117] "RemoveContainer" containerID="669d56e78759519de5a6dd239e9cf24e944424e3eb64de05a26d842e32401407" Mar 10 00:16:11 crc kubenswrapper[4994]: I0310 00:16:11.583692 4994 scope.go:117] "RemoveContainer" containerID="ee78e5054ad5ad8a035342e7024985079f992d0c77022319d8f1e7f3d55f9eb9" Mar 10 00:17:11 crc kubenswrapper[4994]: I0310 00:17:11.697666 4994 scope.go:117] "RemoveContainer" containerID="0b99026327a0246e8d6a6998d063d7da1dc8dded77c229f16aea5a63dc4137ba" Mar 10 00:17:18 crc kubenswrapper[4994]: I0310 00:17:18.893127 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:17:18 crc kubenswrapper[4994]: I0310 00:17:18.893510 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:17:48 crc 
kubenswrapper[4994]: I0310 00:17:48.893300 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:17:48 crc kubenswrapper[4994]: I0310 00:17:48.893999 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.146779 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551698-2d6g5"] Mar 10 00:18:00 crc kubenswrapper[4994]: E0310 00:18:00.147863 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25bd204-3572-4880-b74f-764a5a3e0123" containerName="oc" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.147923 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25bd204-3572-4880-b74f-764a5a3e0123" containerName="oc" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.148112 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f25bd204-3572-4880-b74f-764a5a3e0123" containerName="oc" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.148750 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551698-2d6g5" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.152226 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.154372 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.156695 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551698-2d6g5"] Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.193779 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.246993 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnq9s\" (UniqueName: \"kubernetes.io/projected/6471fd89-1c92-498d-ba15-149418259c58-kube-api-access-hnq9s\") pod \"auto-csr-approver-29551698-2d6g5\" (UID: \"6471fd89-1c92-498d-ba15-149418259c58\") " pod="openshift-infra/auto-csr-approver-29551698-2d6g5" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.348648 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnq9s\" (UniqueName: \"kubernetes.io/projected/6471fd89-1c92-498d-ba15-149418259c58-kube-api-access-hnq9s\") pod \"auto-csr-approver-29551698-2d6g5\" (UID: \"6471fd89-1c92-498d-ba15-149418259c58\") " pod="openshift-infra/auto-csr-approver-29551698-2d6g5" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.382932 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnq9s\" (UniqueName: \"kubernetes.io/projected/6471fd89-1c92-498d-ba15-149418259c58-kube-api-access-hnq9s\") pod \"auto-csr-approver-29551698-2d6g5\" (UID: \"6471fd89-1c92-498d-ba15-149418259c58\") " 
pod="openshift-infra/auto-csr-approver-29551698-2d6g5" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.520405 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551698-2d6g5" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.757411 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551698-2d6g5"] Mar 10 00:18:01 crc kubenswrapper[4994]: I0310 00:18:01.213061 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551698-2d6g5" event={"ID":"6471fd89-1c92-498d-ba15-149418259c58","Type":"ContainerStarted","Data":"4a8734c3105aac95c1dec5df57a7a79d1a3b6f1e13fd44aae694240c430ec8c2"} Mar 10 00:18:03 crc kubenswrapper[4994]: I0310 00:18:03.245110 4994 generic.go:334] "Generic (PLEG): container finished" podID="6471fd89-1c92-498d-ba15-149418259c58" containerID="b02eba80da92e156810a460cd2fd2b2fbae8ce74141ff71d34b4fc6b8bc7db3f" exitCode=0 Mar 10 00:18:03 crc kubenswrapper[4994]: I0310 00:18:03.245172 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551698-2d6g5" event={"ID":"6471fd89-1c92-498d-ba15-149418259c58","Type":"ContainerDied","Data":"b02eba80da92e156810a460cd2fd2b2fbae8ce74141ff71d34b4fc6b8bc7db3f"} Mar 10 00:18:04 crc kubenswrapper[4994]: I0310 00:18:04.570851 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551698-2d6g5" Mar 10 00:18:04 crc kubenswrapper[4994]: I0310 00:18:04.704575 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnq9s\" (UniqueName: \"kubernetes.io/projected/6471fd89-1c92-498d-ba15-149418259c58-kube-api-access-hnq9s\") pod \"6471fd89-1c92-498d-ba15-149418259c58\" (UID: \"6471fd89-1c92-498d-ba15-149418259c58\") " Mar 10 00:18:04 crc kubenswrapper[4994]: I0310 00:18:04.719950 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6471fd89-1c92-498d-ba15-149418259c58-kube-api-access-hnq9s" (OuterVolumeSpecName: "kube-api-access-hnq9s") pod "6471fd89-1c92-498d-ba15-149418259c58" (UID: "6471fd89-1c92-498d-ba15-149418259c58"). InnerVolumeSpecName "kube-api-access-hnq9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:18:04 crc kubenswrapper[4994]: I0310 00:18:04.806713 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnq9s\" (UniqueName: \"kubernetes.io/projected/6471fd89-1c92-498d-ba15-149418259c58-kube-api-access-hnq9s\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:05 crc kubenswrapper[4994]: I0310 00:18:05.260691 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551698-2d6g5" event={"ID":"6471fd89-1c92-498d-ba15-149418259c58","Type":"ContainerDied","Data":"4a8734c3105aac95c1dec5df57a7a79d1a3b6f1e13fd44aae694240c430ec8c2"} Mar 10 00:18:05 crc kubenswrapper[4994]: I0310 00:18:05.260751 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a8734c3105aac95c1dec5df57a7a79d1a3b6f1e13fd44aae694240c430ec8c2" Mar 10 00:18:05 crc kubenswrapper[4994]: I0310 00:18:05.260771 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551698-2d6g5" Mar 10 00:18:05 crc kubenswrapper[4994]: I0310 00:18:05.645427 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551692-29hls"] Mar 10 00:18:05 crc kubenswrapper[4994]: I0310 00:18:05.652581 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551692-29hls"] Mar 10 00:18:06 crc kubenswrapper[4994]: I0310 00:18:06.562481 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24a70a0f-0e78-4f55-9eee-62099acf734d" path="/var/lib/kubelet/pods/24a70a0f-0e78-4f55-9eee-62099acf734d/volumes" Mar 10 00:18:11 crc kubenswrapper[4994]: I0310 00:18:11.750182 4994 scope.go:117] "RemoveContainer" containerID="f159c5160bbebc2d55cd9bb33ea390e800dbff7ea9620c436631139ca88c6b3b" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.490015 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ns797"] Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.491249 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovn-controller" containerID="cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd" gracePeriod=30 Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.491306 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="nbdb" containerID="cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c" gracePeriod=30 Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.491404 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" 
containerName="kube-rbac-proxy-node" containerID="cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b" gracePeriod=30 Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.491491 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovn-acl-logging" containerID="cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700" gracePeriod=30 Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.491509 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="sbdb" containerID="cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1" gracePeriod=30 Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.491428 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85" gracePeriod=30 Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.491428 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="northd" containerID="cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f" gracePeriod=30 Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.543227 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" containerID="cri-o://69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9" gracePeriod=30 Mar 10 00:18:18 crc 
kubenswrapper[4994]: I0310 00:18:18.838307 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/3.log" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.842084 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovn-acl-logging/0.log" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.843659 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovn-controller/0.log" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.844409 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.899962 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.900158 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.901148 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.903528 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"3a58b30808ba3fd3b4a9259ace5f595c7d1bd5910098d24eb4a7ede149499cfa"} pod="openshift-machine-config-operator/machine-config-daemon-kfljj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.903676 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" containerID="cri-o://3a58b30808ba3fd3b4a9259ace5f595c7d1bd5910098d24eb4a7ede149499cfa" gracePeriod=600 Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918227 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918110 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-var-lib-openvswitch\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918396 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918524 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-bin\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918600 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-etc-openvswitch\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918644 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-slash\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918678 4994 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-systemd-units\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918728 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s42gc\" (UniqueName: \"kubernetes.io/projected/72a13a81-4c11-4529-8a3d-2dd3c73215a7-kube-api-access-s42gc\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918765 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-env-overrides\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918799 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-netns\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918837 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-ovn-kubernetes\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918865 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-ovn\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: 
\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918920 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovn-node-metrics-cert\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918960 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-openvswitch\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.919023 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-script-lib\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.919062 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-netd\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.919100 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-systemd\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.919141 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-kubelet\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.919187 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-config\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.919226 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-node-log\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.919301 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-log-socket\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.919743 4994 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918476 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.919978 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.920049 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.920063 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.920110 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-slash" (OuterVolumeSpecName: "host-slash") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.920084 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.920962 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.921071 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.921540 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.921113 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.921309 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-log-socket" (OuterVolumeSpecName: "log-socket") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.921610 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-node-log" (OuterVolumeSpecName: "node-log") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.921657 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.921683 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.921946 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.922009 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.929798 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72a13a81-4c11-4529-8a3d-2dd3c73215a7-kube-api-access-s42gc" (OuterVolumeSpecName: "kube-api-access-s42gc") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "kube-api-access-s42gc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.933230 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.941782 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d5dml"] Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942378 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6471fd89-1c92-498d-ba15-149418259c58" containerName="oc" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942424 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6471fd89-1c92-498d-ba15-149418259c58" containerName="oc" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942450 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="kube-rbac-proxy-node" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942468 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="kube-rbac-proxy-node" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942492 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="nbdb" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942510 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="nbdb" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942537 4994 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="northd" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942554 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="northd" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942581 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942601 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942624 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942642 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942666 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="kubecfg-setup" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942683 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="kubecfg-setup" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942706 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942726 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942752 4994 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942770 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942792 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="sbdb" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942809 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="sbdb" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942837 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovn-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942855 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovn-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942969 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovn-acl-logging" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942992 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovn-acl-logging" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943246 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943285 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovn-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943309 4994 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="northd" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943335 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="kube-rbac-proxy-node" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943354 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943370 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943395 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="6471fd89-1c92-498d-ba15-149418259c58" containerName="oc" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943419 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovn-acl-logging" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943443 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943464 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="nbdb" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943486 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943505 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="sbdb" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.943779 4994 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943805 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.943836 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943854 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.944067 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.947123 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.957588 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.020834 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-run-ovn-kubernetes\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.020956 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n4db\" (UniqueName: \"kubernetes.io/projected/43e9ccbc-21ed-4371-8fde-cd3728441d1e-kube-api-access-5n4db\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021033 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43e9ccbc-21ed-4371-8fde-cd3728441d1e-env-overrides\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021158 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43e9ccbc-21ed-4371-8fde-cd3728441d1e-ovnkube-script-lib\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021212 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-cni-bin\") pod 
\"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021297 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-run-systemd\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021349 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-etc-openvswitch\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021425 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-node-log\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021486 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43e9ccbc-21ed-4371-8fde-cd3728441d1e-ovnkube-config\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021530 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-run-netns\") pod 
\"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021577 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-run-ovn\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021623 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-kubelet\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021739 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43e9ccbc-21ed-4371-8fde-cd3728441d1e-ovn-node-metrics-cert\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021812 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-run-openvswitch\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021902 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-cni-netd\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021971 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-slash\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022057 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-var-lib-openvswitch\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022123 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022186 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-systemd-units\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022239 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-log-socket\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022344 4994 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022386 4994 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-slash\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022414 4994 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022440 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s42gc\" (UniqueName: \"kubernetes.io/projected/72a13a81-4c11-4529-8a3d-2dd3c73215a7-kube-api-access-s42gc\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022467 4994 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022494 4994 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022522 4994 
reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022546 4994 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022571 4994 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022595 4994 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022619 4994 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022645 4994 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022671 4994 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022796 4994 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022849 4994 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022909 4994 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-node-log\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022938 4994 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-log-socket\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022968 4994 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022988 4994 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.124739 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-run-systemd\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.124798 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-etc-openvswitch\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.124837 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-node-log\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.124900 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43e9ccbc-21ed-4371-8fde-cd3728441d1e-ovnkube-config\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.124931 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-run-netns\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.124967 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-run-ovn\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.124989 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-node-log\") pod 
\"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125007 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-kubelet\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.124978 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-run-systemd\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125035 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-run-netns\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125094 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43e9ccbc-21ed-4371-8fde-cd3728441d1e-ovn-node-metrics-cert\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125112 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-run-ovn\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125137 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-kubelet\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125158 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-run-openvswitch\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125215 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-run-openvswitch\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125257 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-cni-netd\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125323 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-slash\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 
00:18:19.125420 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-var-lib-openvswitch\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125472 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-slash\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125482 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125552 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125611 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-systemd-units\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 
00:18:19.125625 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-var-lib-openvswitch\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125424 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-cni-netd\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125557 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-systemd-units\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125721 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-log-socket\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125773 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-run-ovn-kubernetes\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125813 4994 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-log-socket\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125826 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n4db\" (UniqueName: \"kubernetes.io/projected/43e9ccbc-21ed-4371-8fde-cd3728441d1e-kube-api-access-5n4db\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125949 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43e9ccbc-21ed-4371-8fde-cd3728441d1e-env-overrides\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.126037 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43e9ccbc-21ed-4371-8fde-cd3728441d1e-ovnkube-script-lib\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.126087 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-cni-bin\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.126206 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-cni-bin\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.126221 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-run-ovn-kubernetes\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.126497 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43e9ccbc-21ed-4371-8fde-cd3728441d1e-ovnkube-config\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.126602 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-etc-openvswitch\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.127125 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43e9ccbc-21ed-4371-8fde-cd3728441d1e-env-overrides\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.128339 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43e9ccbc-21ed-4371-8fde-cd3728441d1e-ovnkube-script-lib\") pod 
\"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.134782 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43e9ccbc-21ed-4371-8fde-cd3728441d1e-ovn-node-metrics-cert\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.155623 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n4db\" (UniqueName: \"kubernetes.io/projected/43e9ccbc-21ed-4371-8fde-cd3728441d1e-kube-api-access-5n4db\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.276013 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: W0310 00:18:19.307151 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43e9ccbc_21ed_4371_8fde_cd3728441d1e.slice/crio-5bb57f5d24154ee36a367a988551ee8d76bdccc9de1caa9dee5df00900d09885 WatchSource:0}: Error finding container 5bb57f5d24154ee36a367a988551ee8d76bdccc9de1caa9dee5df00900d09885: Status 404 returned error can't find the container with id 5bb57f5d24154ee36a367a988551ee8d76bdccc9de1caa9dee5df00900d09885 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.391264 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" event={"ID":"43e9ccbc-21ed-4371-8fde-cd3728441d1e","Type":"ContainerStarted","Data":"5bb57f5d24154ee36a367a988551ee8d76bdccc9de1caa9dee5df00900d09885"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.396897 4994 generic.go:334] "Generic (PLEG): container finished" podID="ced5d66d-39df-4267-b801-e1e60d517ace" containerID="3a58b30808ba3fd3b4a9259ace5f595c7d1bd5910098d24eb4a7ede149499cfa" exitCode=0 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.397097 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerDied","Data":"3a58b30808ba3fd3b4a9259ace5f595c7d1bd5910098d24eb4a7ede149499cfa"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.397353 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"226ba5dd02665930d82054325a4e53e30bff51d1812a48cc472d4cc84e6237db"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.397447 4994 scope.go:117] "RemoveContainer" 
containerID="e0758e06b050c72a5ba1ca15578add547da69884a82997478d49f051fd653d6f" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.401595 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mcxcb_6dac87a5-07eb-488d-85fe-cb8848434ae5/kube-multus/2.log" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.405139 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mcxcb_6dac87a5-07eb-488d-85fe-cb8848434ae5/kube-multus/1.log" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.405235 4994 generic.go:334] "Generic (PLEG): container finished" podID="6dac87a5-07eb-488d-85fe-cb8848434ae5" containerID="d5d956a023e0ec491c4113a78f5143267fd1a96d627be08eb78b078d22c69f89" exitCode=2 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.405374 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mcxcb" event={"ID":"6dac87a5-07eb-488d-85fe-cb8848434ae5","Type":"ContainerDied","Data":"d5d956a023e0ec491c4113a78f5143267fd1a96d627be08eb78b078d22c69f89"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.406391 4994 scope.go:117] "RemoveContainer" containerID="d5d956a023e0ec491c4113a78f5143267fd1a96d627be08eb78b078d22c69f89" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.406960 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-mcxcb_openshift-multus(6dac87a5-07eb-488d-85fe-cb8848434ae5)\"" pod="openshift-multus/multus-mcxcb" podUID="6dac87a5-07eb-488d-85fe-cb8848434ae5" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.409632 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/3.log" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.414178 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovn-acl-logging/0.log" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.415051 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovn-controller/0.log" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.415784 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9" exitCode=0 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.415865 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1" exitCode=0 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.415948 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c" exitCode=0 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416008 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f" exitCode=0 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416069 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85" exitCode=0 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416126 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b" exitCode=0 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416187 4994 generic.go:334] "Generic (PLEG): 
container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700" exitCode=143 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416248 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd" exitCode=143 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.415818 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416388 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.415962 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416454 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416638 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416682 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416705 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416727 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416744 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416756 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416767 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416778 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416803 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416815 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416827 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416840 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416851 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416863 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416940 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416958 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416972 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416983 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416994 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417004 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417015 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417025 4994 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417036 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417046 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417057 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417071 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417088 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417101 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417111 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1"} Mar 10 
00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417122 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417132 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417142 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417153 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417163 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417173 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417186 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417199 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" 
event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"b8da989a4363394b0ce6c6c658409a1fc0f3d0c82d2ec6c0704e2ab145277cf7"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417215 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417228 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417240 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417251 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417262 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417273 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417284 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417294 4994 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417305 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417315 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.468794 4994 scope.go:117] "RemoveContainer" containerID="04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.486439 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ns797"] Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.495306 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ns797"] Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.500350 4994 scope.go:117] "RemoveContainer" containerID="69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.570712 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.593549 4994 scope.go:117] "RemoveContainer" containerID="9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.613421 4994 scope.go:117] "RemoveContainer" containerID="80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.666262 4994 scope.go:117] "RemoveContainer" 
containerID="d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.685094 4994 scope.go:117] "RemoveContainer" containerID="922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.700604 4994 scope.go:117] "RemoveContainer" containerID="f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.717228 4994 scope.go:117] "RemoveContainer" containerID="9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.732758 4994 scope.go:117] "RemoveContainer" containerID="ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.747960 4994 scope.go:117] "RemoveContainer" containerID="dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.769291 4994 scope.go:117] "RemoveContainer" containerID="69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.769850 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9\": container with ID starting with 69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9 not found: ID does not exist" containerID="69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.769926 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9"} err="failed to get container status \"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9\": rpc error: code = NotFound desc = could not 
find container \"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9\": container with ID starting with 69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.769961 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.770585 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\": container with ID starting with c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e not found: ID does not exist" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.770627 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e"} err="failed to get container status \"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\": rpc error: code = NotFound desc = could not find container \"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\": container with ID starting with c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.770654 4994 scope.go:117] "RemoveContainer" containerID="9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.771168 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\": container with ID starting with 9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1 not found: ID 
does not exist" containerID="9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.771335 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1"} err="failed to get container status \"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\": rpc error: code = NotFound desc = could not find container \"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\": container with ID starting with 9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.771427 4994 scope.go:117] "RemoveContainer" containerID="80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.772097 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\": container with ID starting with 80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c not found: ID does not exist" containerID="80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.772139 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c"} err="failed to get container status \"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\": rpc error: code = NotFound desc = could not find container \"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\": container with ID starting with 80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.772168 4994 
scope.go:117] "RemoveContainer" containerID="d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.772719 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\": container with ID starting with d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f not found: ID does not exist" containerID="d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.772760 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f"} err="failed to get container status \"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\": rpc error: code = NotFound desc = could not find container \"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\": container with ID starting with d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.772787 4994 scope.go:117] "RemoveContainer" containerID="922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.773080 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\": container with ID starting with 922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85 not found: ID does not exist" containerID="922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.773124 4994 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85"} err="failed to get container status \"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\": rpc error: code = NotFound desc = could not find container \"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\": container with ID starting with 922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.773154 4994 scope.go:117] "RemoveContainer" containerID="f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.773551 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\": container with ID starting with f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b not found: ID does not exist" containerID="f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.773591 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b"} err="failed to get container status \"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\": rpc error: code = NotFound desc = could not find container \"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\": container with ID starting with f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.773618 4994 scope.go:117] "RemoveContainer" containerID="9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.774018 4994 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\": container with ID starting with 9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700 not found: ID does not exist" containerID="9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.774057 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700"} err="failed to get container status \"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\": rpc error: code = NotFound desc = could not find container \"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\": container with ID starting with 9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.774082 4994 scope.go:117] "RemoveContainer" containerID="ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.774427 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\": container with ID starting with ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd not found: ID does not exist" containerID="ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.774467 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd"} err="failed to get container status \"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\": rpc error: code = NotFound desc = could not find container 
\"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\": container with ID starting with ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.774494 4994 scope.go:117] "RemoveContainer" containerID="dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.774836 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\": container with ID starting with dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6 not found: ID does not exist" containerID="dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.774925 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6"} err="failed to get container status \"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\": rpc error: code = NotFound desc = could not find container \"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\": container with ID starting with dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.774963 4994 scope.go:117] "RemoveContainer" containerID="69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.775334 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9"} err="failed to get container status \"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9\": rpc error: code = NotFound desc = could not find 
container \"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9\": container with ID starting with 69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.775387 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.775783 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e"} err="failed to get container status \"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\": rpc error: code = NotFound desc = could not find container \"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\": container with ID starting with c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.775820 4994 scope.go:117] "RemoveContainer" containerID="9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.776148 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1"} err="failed to get container status \"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\": rpc error: code = NotFound desc = could not find container \"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\": container with ID starting with 9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.776175 4994 scope.go:117] "RemoveContainer" containerID="80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.776431 4994 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c"} err="failed to get container status \"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\": rpc error: code = NotFound desc = could not find container \"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\": container with ID starting with 80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.776465 4994 scope.go:117] "RemoveContainer" containerID="d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.776756 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f"} err="failed to get container status \"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\": rpc error: code = NotFound desc = could not find container \"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\": container with ID starting with d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.776791 4994 scope.go:117] "RemoveContainer" containerID="922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.777079 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85"} err="failed to get container status \"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\": rpc error: code = NotFound desc = could not find container \"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\": container with ID starting with 
922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.777132 4994 scope.go:117] "RemoveContainer" containerID="f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.777512 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b"} err="failed to get container status \"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\": rpc error: code = NotFound desc = could not find container \"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\": container with ID starting with f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.777549 4994 scope.go:117] "RemoveContainer" containerID="9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.777835 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700"} err="failed to get container status \"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\": rpc error: code = NotFound desc = could not find container \"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\": container with ID starting with 9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.777869 4994 scope.go:117] "RemoveContainer" containerID="ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.778285 4994 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd"} err="failed to get container status \"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\": rpc error: code = NotFound desc = could not find container \"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\": container with ID starting with ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.778317 4994 scope.go:117] "RemoveContainer" containerID="dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.778602 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6"} err="failed to get container status \"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\": rpc error: code = NotFound desc = could not find container \"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\": container with ID starting with dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.778634 4994 scope.go:117] "RemoveContainer" containerID="69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.778947 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9"} err="failed to get container status \"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9\": rpc error: code = NotFound desc = could not find container \"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9\": container with ID starting with 69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9 not found: ID does not 
exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.778982 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.779275 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e"} err="failed to get container status \"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\": rpc error: code = NotFound desc = could not find container \"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\": container with ID starting with c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.779338 4994 scope.go:117] "RemoveContainer" containerID="9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.779629 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1"} err="failed to get container status \"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\": rpc error: code = NotFound desc = could not find container \"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\": container with ID starting with 9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.779661 4994 scope.go:117] "RemoveContainer" containerID="80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.780023 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c"} err="failed to get container status 
\"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\": rpc error: code = NotFound desc = could not find container \"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\": container with ID starting with 80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.780065 4994 scope.go:117] "RemoveContainer" containerID="d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.780380 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f"} err="failed to get container status \"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\": rpc error: code = NotFound desc = could not find container \"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\": container with ID starting with d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.780415 4994 scope.go:117] "RemoveContainer" containerID="922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.780725 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85"} err="failed to get container status \"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\": rpc error: code = NotFound desc = could not find container \"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\": container with ID starting with 922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.780852 4994 scope.go:117] "RemoveContainer" 
containerID="f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.781184 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b"} err="failed to get container status \"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\": rpc error: code = NotFound desc = could not find container \"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\": container with ID starting with f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.781232 4994 scope.go:117] "RemoveContainer" containerID="9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.781602 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700"} err="failed to get container status \"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\": rpc error: code = NotFound desc = could not find container \"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\": container with ID starting with 9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.781656 4994 scope.go:117] "RemoveContainer" containerID="ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.782073 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd"} err="failed to get container status \"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\": rpc error: code = NotFound desc = could 
not find container \"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\": container with ID starting with ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.782143 4994 scope.go:117] "RemoveContainer" containerID="dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.782481 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6"} err="failed to get container status \"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\": rpc error: code = NotFound desc = could not find container \"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\": container with ID starting with dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.782530 4994 scope.go:117] "RemoveContainer" containerID="69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.783006 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9"} err="failed to get container status \"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9\": rpc error: code = NotFound desc = could not find container \"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9\": container with ID starting with 69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.783061 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 
00:18:19.783483 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e"} err="failed to get container status \"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\": rpc error: code = NotFound desc = could not find container \"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\": container with ID starting with c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.783521 4994 scope.go:117] "RemoveContainer" containerID="9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.784134 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1"} err="failed to get container status \"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\": rpc error: code = NotFound desc = could not find container \"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\": container with ID starting with 9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.784180 4994 scope.go:117] "RemoveContainer" containerID="80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.784701 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c"} err="failed to get container status \"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\": rpc error: code = NotFound desc = could not find container \"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\": container with ID starting with 
80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.784753 4994 scope.go:117] "RemoveContainer" containerID="d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.785323 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f"} err="failed to get container status \"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\": rpc error: code = NotFound desc = could not find container \"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\": container with ID starting with d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.785347 4994 scope.go:117] "RemoveContainer" containerID="922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.785701 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85"} err="failed to get container status \"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\": rpc error: code = NotFound desc = could not find container \"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\": container with ID starting with 922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.785742 4994 scope.go:117] "RemoveContainer" containerID="f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.786083 4994 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b"} err="failed to get container status \"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\": rpc error: code = NotFound desc = could not find container \"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\": container with ID starting with f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.786108 4994 scope.go:117] "RemoveContainer" containerID="9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.786419 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700"} err="failed to get container status \"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\": rpc error: code = NotFound desc = could not find container \"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\": container with ID starting with 9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.786470 4994 scope.go:117] "RemoveContainer" containerID="ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.786783 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd"} err="failed to get container status \"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\": rpc error: code = NotFound desc = could not find container \"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\": container with ID starting with ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd not found: ID does not 
exist" Mar 10 00:18:20 crc kubenswrapper[4994]: I0310 00:18:20.430487 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mcxcb_6dac87a5-07eb-488d-85fe-cb8848434ae5/kube-multus/2.log" Mar 10 00:18:20 crc kubenswrapper[4994]: I0310 00:18:20.436396 4994 generic.go:334] "Generic (PLEG): container finished" podID="43e9ccbc-21ed-4371-8fde-cd3728441d1e" containerID="7c9e2f1a80a6409196282c7bb495c555806137480f8f7bcf8d441fd872a4edff" exitCode=0 Mar 10 00:18:20 crc kubenswrapper[4994]: I0310 00:18:20.436452 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" event={"ID":"43e9ccbc-21ed-4371-8fde-cd3728441d1e","Type":"ContainerDied","Data":"7c9e2f1a80a6409196282c7bb495c555806137480f8f7bcf8d441fd872a4edff"} Mar 10 00:18:20 crc kubenswrapper[4994]: I0310 00:18:20.568195 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" path="/var/lib/kubelet/pods/72a13a81-4c11-4529-8a3d-2dd3c73215a7/volumes" Mar 10 00:18:21 crc kubenswrapper[4994]: I0310 00:18:21.444763 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" event={"ID":"43e9ccbc-21ed-4371-8fde-cd3728441d1e","Type":"ContainerStarted","Data":"8c933bc21d307a9ac6a37d4b3ea905a4d8acaa7233ebeadadaf6c9e24814262d"} Mar 10 00:18:21 crc kubenswrapper[4994]: I0310 00:18:21.445020 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" event={"ID":"43e9ccbc-21ed-4371-8fde-cd3728441d1e","Type":"ContainerStarted","Data":"30fb89fc99ef3646d6a83219d5ce24d8db0963cdce1a452324bf0f9fbca26a8a"} Mar 10 00:18:21 crc kubenswrapper[4994]: I0310 00:18:21.445052 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" 
event={"ID":"43e9ccbc-21ed-4371-8fde-cd3728441d1e","Type":"ContainerStarted","Data":"fb1cb8d837ed458e2c85e23928d18be2cac1a157c11de004378c33273389aa39"} Mar 10 00:18:21 crc kubenswrapper[4994]: I0310 00:18:21.445062 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" event={"ID":"43e9ccbc-21ed-4371-8fde-cd3728441d1e","Type":"ContainerStarted","Data":"3e72f53f12212178b7de7534a4ff49f0fcb7820be27a1f8a1c7e666ebb545ec8"} Mar 10 00:18:21 crc kubenswrapper[4994]: I0310 00:18:21.445073 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" event={"ID":"43e9ccbc-21ed-4371-8fde-cd3728441d1e","Type":"ContainerStarted","Data":"da4c02cb7f63028d262af9e65b97efacf94be0c058afe29dc17595a73977193c"} Mar 10 00:18:22 crc kubenswrapper[4994]: I0310 00:18:22.458422 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" event={"ID":"43e9ccbc-21ed-4371-8fde-cd3728441d1e","Type":"ContainerStarted","Data":"be4aefd7f943184d51b01c6f8062d8bc9cc3336ab78d5c62b309198209983bcd"} Mar 10 00:18:24 crc kubenswrapper[4994]: I0310 00:18:24.478383 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" event={"ID":"43e9ccbc-21ed-4371-8fde-cd3728441d1e","Type":"ContainerStarted","Data":"854352bac2746c64e929519cda8ed0d737bd0e49d96556d49bd53ccdb7f5c968"} Mar 10 00:18:26 crc kubenswrapper[4994]: I0310 00:18:26.495669 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" event={"ID":"43e9ccbc-21ed-4371-8fde-cd3728441d1e","Type":"ContainerStarted","Data":"1286a5909c570e5cdc4cfcbb1418a23628d9a1132f146abe27510f0b12cf4b74"} Mar 10 00:18:26 crc kubenswrapper[4994]: I0310 00:18:26.496623 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:26 crc kubenswrapper[4994]: I0310 00:18:26.496648 
4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:26 crc kubenswrapper[4994]: I0310 00:18:26.496667 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:26 crc kubenswrapper[4994]: I0310 00:18:26.535001 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:26 crc kubenswrapper[4994]: I0310 00:18:26.535593 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" podStartSLOduration=8.535574464 podStartE2EDuration="8.535574464s" podCreationTimestamp="2026-03-10 00:18:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:18:26.535392309 +0000 UTC m=+720.709099048" watchObservedRunningTime="2026-03-10 00:18:26.535574464 +0000 UTC m=+720.709281233" Mar 10 00:18:26 crc kubenswrapper[4994]: I0310 00:18:26.536733 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:33 crc kubenswrapper[4994]: I0310 00:18:33.553672 4994 scope.go:117] "RemoveContainer" containerID="d5d956a023e0ec491c4113a78f5143267fd1a96d627be08eb78b078d22c69f89" Mar 10 00:18:33 crc kubenswrapper[4994]: E0310 00:18:33.554403 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-mcxcb_openshift-multus(6dac87a5-07eb-488d-85fe-cb8848434ae5)\"" pod="openshift-multus/multus-mcxcb" podUID="6dac87a5-07eb-488d-85fe-cb8848434ae5" Mar 10 00:18:45 crc kubenswrapper[4994]: I0310 00:18:45.554543 4994 scope.go:117] "RemoveContainer" 
containerID="d5d956a023e0ec491c4113a78f5143267fd1a96d627be08eb78b078d22c69f89" Mar 10 00:18:46 crc kubenswrapper[4994]: I0310 00:18:46.639373 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mcxcb_6dac87a5-07eb-488d-85fe-cb8848434ae5/kube-multus/2.log" Mar 10 00:18:46 crc kubenswrapper[4994]: I0310 00:18:46.639792 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mcxcb" event={"ID":"6dac87a5-07eb-488d-85fe-cb8848434ae5","Type":"ContainerStarted","Data":"2eeab3e0d6126ae7f064e1fd14955fd188a04b0e8b1213a0253d264716e50167"} Mar 10 00:18:49 crc kubenswrapper[4994]: I0310 00:18:49.315114 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:19:14 crc kubenswrapper[4994]: I0310 00:19:14.727926 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2b884"] Mar 10 00:19:14 crc kubenswrapper[4994]: I0310 00:19:14.728991 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2b884" podUID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerName="registry-server" containerID="cri-o://025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138" gracePeriod=30 Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.124173 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2b884" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.253319 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-catalog-content\") pod \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.253483 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-utilities\") pod \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.253557 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqxvd\" (UniqueName: \"kubernetes.io/projected/33f42081-c92d-42cd-90b5-329a5ae6c2ad-kube-api-access-cqxvd\") pod \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.254995 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-utilities" (OuterVolumeSpecName: "utilities") pod "33f42081-c92d-42cd-90b5-329a5ae6c2ad" (UID: "33f42081-c92d-42cd-90b5-329a5ae6c2ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.263237 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33f42081-c92d-42cd-90b5-329a5ae6c2ad-kube-api-access-cqxvd" (OuterVolumeSpecName: "kube-api-access-cqxvd") pod "33f42081-c92d-42cd-90b5-329a5ae6c2ad" (UID: "33f42081-c92d-42cd-90b5-329a5ae6c2ad"). InnerVolumeSpecName "kube-api-access-cqxvd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.306252 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33f42081-c92d-42cd-90b5-329a5ae6c2ad" (UID: "33f42081-c92d-42cd-90b5-329a5ae6c2ad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.355318 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.355381 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.355407 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqxvd\" (UniqueName: \"kubernetes.io/projected/33f42081-c92d-42cd-90b5-329a5ae6c2ad-kube-api-access-cqxvd\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.847070 4994 generic.go:334] "Generic (PLEG): container finished" podID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerID="025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138" exitCode=0 Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.847132 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2b884" event={"ID":"33f42081-c92d-42cd-90b5-329a5ae6c2ad","Type":"ContainerDied","Data":"025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138"} Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.847180 4994 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2b884" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.847203 4994 scope.go:117] "RemoveContainer" containerID="025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.847186 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2b884" event={"ID":"33f42081-c92d-42cd-90b5-329a5ae6c2ad","Type":"ContainerDied","Data":"2092117d1a588bddb0afa93402f75947e71f54b402d21d698eb1a68abf48a5d8"} Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.875594 4994 scope.go:117] "RemoveContainer" containerID="d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.892152 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2b884"] Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.900209 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2b884"] Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.910662 4994 scope.go:117] "RemoveContainer" containerID="9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.942701 4994 scope.go:117] "RemoveContainer" containerID="025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138" Mar 10 00:19:15 crc kubenswrapper[4994]: E0310 00:19:15.943454 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138\": container with ID starting with 025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138 not found: ID does not exist" containerID="025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.943501 4994 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138"} err="failed to get container status \"025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138\": rpc error: code = NotFound desc = could not find container \"025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138\": container with ID starting with 025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138 not found: ID does not exist" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.943532 4994 scope.go:117] "RemoveContainer" containerID="d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a" Mar 10 00:19:15 crc kubenswrapper[4994]: E0310 00:19:15.944041 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a\": container with ID starting with d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a not found: ID does not exist" containerID="d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.944241 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a"} err="failed to get container status \"d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a\": rpc error: code = NotFound desc = could not find container \"d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a\": container with ID starting with d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a not found: ID does not exist" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.944401 4994 scope.go:117] "RemoveContainer" containerID="9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b" Mar 10 00:19:15 crc kubenswrapper[4994]: E0310 
00:19:15.945146 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b\": container with ID starting with 9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b not found: ID does not exist" containerID="9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.945368 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b"} err="failed to get container status \"9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b\": rpc error: code = NotFound desc = could not find container \"9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b\": container with ID starting with 9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b not found: ID does not exist" Mar 10 00:19:16 crc kubenswrapper[4994]: I0310 00:19:16.567155 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" path="/var/lib/kubelet/pods/33f42081-c92d-42cd-90b5-329a5ae6c2ad/volumes" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.729662 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz"] Mar 10 00:19:18 crc kubenswrapper[4994]: E0310 00:19:18.729973 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerName="registry-server" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.729994 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerName="registry-server" Mar 10 00:19:18 crc kubenswrapper[4994]: E0310 00:19:18.730028 4994 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerName="extract-content" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.730040 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerName="extract-content" Mar 10 00:19:18 crc kubenswrapper[4994]: E0310 00:19:18.730060 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerName="extract-utilities" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.730072 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerName="extract-utilities" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.730234 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerName="registry-server" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.731482 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.733947 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.740830 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz"] Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.815895 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" Mar 10 00:19:18 crc 
kubenswrapper[4994]: I0310 00:19:18.816057 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.816332 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnfms\" (UniqueName: \"kubernetes.io/projected/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-kube-api-access-wnfms\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.918006 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.918237 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnfms\" (UniqueName: \"kubernetes.io/projected/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-kube-api-access-wnfms\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.918361 4994 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.918506 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.918761 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.950716 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnfms\" (UniqueName: \"kubernetes.io/projected/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-kube-api-access-wnfms\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" Mar 10 00:19:19 crc kubenswrapper[4994]: I0310 00:19:19.059835 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" Mar 10 00:19:19 crc kubenswrapper[4994]: I0310 00:19:19.372028 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz"] Mar 10 00:19:19 crc kubenswrapper[4994]: I0310 00:19:19.876478 4994 generic.go:334] "Generic (PLEG): container finished" podID="4dea22bc-f7b5-4722-b2c2-db96edfdcb96" containerID="639c9df1d48af8cdff7b0f3a99a12f6945e57821435616c558a6555fc233701d" exitCode=0 Mar 10 00:19:19 crc kubenswrapper[4994]: I0310 00:19:19.876550 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" event={"ID":"4dea22bc-f7b5-4722-b2c2-db96edfdcb96","Type":"ContainerDied","Data":"639c9df1d48af8cdff7b0f3a99a12f6945e57821435616c558a6555fc233701d"} Mar 10 00:19:19 crc kubenswrapper[4994]: I0310 00:19:19.876595 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" event={"ID":"4dea22bc-f7b5-4722-b2c2-db96edfdcb96","Type":"ContainerStarted","Data":"04229c26bff1a539582938656d7ebc1ccc0f230691b0261c551ec8981379d6fe"} Mar 10 00:19:21 crc kubenswrapper[4994]: I0310 00:19:21.897505 4994 generic.go:334] "Generic (PLEG): container finished" podID="4dea22bc-f7b5-4722-b2c2-db96edfdcb96" containerID="2a4772e5d3aa0dd231761f31dc60bd9c3f635f27324c3b160d231ace54e31d4e" exitCode=0 Mar 10 00:19:21 crc kubenswrapper[4994]: I0310 00:19:21.897634 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" event={"ID":"4dea22bc-f7b5-4722-b2c2-db96edfdcb96","Type":"ContainerDied","Data":"2a4772e5d3aa0dd231761f31dc60bd9c3f635f27324c3b160d231ace54e31d4e"} Mar 10 00:19:22 crc kubenswrapper[4994]: I0310 00:19:22.907272 4994 
generic.go:334] "Generic (PLEG): container finished" podID="4dea22bc-f7b5-4722-b2c2-db96edfdcb96" containerID="ff7f02a03f07a7a049b25103440fb388fb7f59c7728ad51beca848d3a3f413c9" exitCode=0 Mar 10 00:19:22 crc kubenswrapper[4994]: I0310 00:19:22.907341 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" event={"ID":"4dea22bc-f7b5-4722-b2c2-db96edfdcb96","Type":"ContainerDied","Data":"ff7f02a03f07a7a049b25103440fb388fb7f59c7728ad51beca848d3a3f413c9"} Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.186359 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.287193 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-bundle\") pod \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") " Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.287272 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnfms\" (UniqueName: \"kubernetes.io/projected/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-kube-api-access-wnfms\") pod \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") " Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.287356 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-util\") pod \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") " Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.290992 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-bundle" (OuterVolumeSpecName: "bundle") pod "4dea22bc-f7b5-4722-b2c2-db96edfdcb96" (UID: "4dea22bc-f7b5-4722-b2c2-db96edfdcb96"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.293350 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-kube-api-access-wnfms" (OuterVolumeSpecName: "kube-api-access-wnfms") pod "4dea22bc-f7b5-4722-b2c2-db96edfdcb96" (UID: "4dea22bc-f7b5-4722-b2c2-db96edfdcb96"). InnerVolumeSpecName "kube-api-access-wnfms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.305533 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-util" (OuterVolumeSpecName: "util") pod "4dea22bc-f7b5-4722-b2c2-db96edfdcb96" (UID: "4dea22bc-f7b5-4722-b2c2-db96edfdcb96"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.421805 4994 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.421861 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnfms\" (UniqueName: \"kubernetes.io/projected/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-kube-api-access-wnfms\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.421908 4994 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-util\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.925402 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" event={"ID":"4dea22bc-f7b5-4722-b2c2-db96edfdcb96","Type":"ContainerDied","Data":"04229c26bff1a539582938656d7ebc1ccc0f230691b0261c551ec8981379d6fe"} Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.925466 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04229c26bff1a539582938656d7ebc1ccc0f230691b0261c551ec8981379d6fe" Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.925486 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.140481 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw"] Mar 10 00:19:25 crc kubenswrapper[4994]: E0310 00:19:25.140994 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dea22bc-f7b5-4722-b2c2-db96edfdcb96" containerName="pull" Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.141062 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dea22bc-f7b5-4722-b2c2-db96edfdcb96" containerName="pull" Mar 10 00:19:25 crc kubenswrapper[4994]: E0310 00:19:25.141088 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dea22bc-f7b5-4722-b2c2-db96edfdcb96" containerName="extract" Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.141107 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dea22bc-f7b5-4722-b2c2-db96edfdcb96" containerName="extract" Mar 10 00:19:25 crc kubenswrapper[4994]: E0310 00:19:25.141138 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dea22bc-f7b5-4722-b2c2-db96edfdcb96" containerName="util" Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.141156 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dea22bc-f7b5-4722-b2c2-db96edfdcb96" containerName="util" Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.141362 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dea22bc-f7b5-4722-b2c2-db96edfdcb96" containerName="extract" Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.142768 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.153101 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.155931 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw"] Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.236292 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knbzz\" (UniqueName: \"kubernetes.io/projected/c792896d-13dd-4202-a2b7-62aac3396c78-kube-api-access-knbzz\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.236643 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.236763 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" Mar 10 00:19:25 crc kubenswrapper[4994]: 
I0310 00:19:25.337535 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.337627 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.337690 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knbzz\" (UniqueName: \"kubernetes.io/projected/c792896d-13dd-4202-a2b7-62aac3396c78-kube-api-access-knbzz\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.338685 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.338972 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.375327 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knbzz\" (UniqueName: \"kubernetes.io/projected/c792896d-13dd-4202-a2b7-62aac3396c78-kube-api-access-knbzz\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.466784 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.825381 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw"] Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.937614 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" event={"ID":"c792896d-13dd-4202-a2b7-62aac3396c78","Type":"ContainerStarted","Data":"acb9dedd7f509cbae847d1e88acbbfd13d5081e980ec7de42a96bc80f913649f"} Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.941997 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk"] Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.945247 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.949438 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.949610 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twhrf\" (UniqueName: \"kubernetes.io/projected/b4b3e4dd-b86b-4442-9067-233a79e7942e-kube-api-access-twhrf\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.949665 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.954248 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk"] Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.051648 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twhrf\" (UniqueName: 
\"kubernetes.io/projected/b4b3e4dd-b86b-4442-9067-233a79e7942e-kube-api-access-twhrf\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.051732 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.051802 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.052325 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.052444 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk\" (UID: 
\"b4b3e4dd-b86b-4442-9067-233a79e7942e\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.083789 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twhrf\" (UniqueName: \"kubernetes.io/projected/b4b3e4dd-b86b-4442-9067-233a79e7942e-kube-api-access-twhrf\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.261621 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.552320 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk"] Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.951022 4994 generic.go:334] "Generic (PLEG): container finished" podID="c792896d-13dd-4202-a2b7-62aac3396c78" containerID="2feceea9b45bb6f6540746677a00ccfd3f91ca6a64ff77d94b7eedf8e12a692a" exitCode=0 Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.951149 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" event={"ID":"c792896d-13dd-4202-a2b7-62aac3396c78","Type":"ContainerDied","Data":"2feceea9b45bb6f6540746677a00ccfd3f91ca6a64ff77d94b7eedf8e12a692a"} Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.955465 4994 generic.go:334] "Generic (PLEG): container finished" podID="b4b3e4dd-b86b-4442-9067-233a79e7942e" containerID="aa8beb4d0f2667b72d954c399b6fdc2c27721125c0776bf7190a24a0fef4c3e8" exitCode=0 Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.955514 4994 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" event={"ID":"b4b3e4dd-b86b-4442-9067-233a79e7942e","Type":"ContainerDied","Data":"aa8beb4d0f2667b72d954c399b6fdc2c27721125c0776bf7190a24a0fef4c3e8"} Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.955758 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" event={"ID":"b4b3e4dd-b86b-4442-9067-233a79e7942e","Type":"ContainerStarted","Data":"e5fe920b3ed139c55793dea0f6a34bc2501658b48e5c546414dd0c79da5357c8"} Mar 10 00:19:27 crc kubenswrapper[4994]: I0310 00:19:27.966068 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" event={"ID":"b4b3e4dd-b86b-4442-9067-233a79e7942e","Type":"ContainerStarted","Data":"6f55af6ab8604a0908f97e3a3b321cd44d205ac4b0ca29e2aadfa0dde099c357"} Mar 10 00:19:28 crc kubenswrapper[4994]: I0310 00:19:28.972176 4994 generic.go:334] "Generic (PLEG): container finished" podID="c792896d-13dd-4202-a2b7-62aac3396c78" containerID="a5678d823f4db87f8b63e9616486bc663448482931f43875bd878700c19d1efc" exitCode=0 Mar 10 00:19:28 crc kubenswrapper[4994]: I0310 00:19:28.972226 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" event={"ID":"c792896d-13dd-4202-a2b7-62aac3396c78","Type":"ContainerDied","Data":"a5678d823f4db87f8b63e9616486bc663448482931f43875bd878700c19d1efc"} Mar 10 00:19:28 crc kubenswrapper[4994]: I0310 00:19:28.974603 4994 generic.go:334] "Generic (PLEG): container finished" podID="b4b3e4dd-b86b-4442-9067-233a79e7942e" containerID="6f55af6ab8604a0908f97e3a3b321cd44d205ac4b0ca29e2aadfa0dde099c357" exitCode=0 Mar 10 00:19:28 crc kubenswrapper[4994]: I0310 00:19:28.974642 4994 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" event={"ID":"b4b3e4dd-b86b-4442-9067-233a79e7942e","Type":"ContainerDied","Data":"6f55af6ab8604a0908f97e3a3b321cd44d205ac4b0ca29e2aadfa0dde099c357"} Mar 10 00:19:29 crc kubenswrapper[4994]: I0310 00:19:29.982098 4994 generic.go:334] "Generic (PLEG): container finished" podID="c792896d-13dd-4202-a2b7-62aac3396c78" containerID="b3a3a146058c89dd59af81116cd5cc2193337b841ba1b804decdab6758ca6140" exitCode=0 Mar 10 00:19:29 crc kubenswrapper[4994]: I0310 00:19:29.982172 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" event={"ID":"c792896d-13dd-4202-a2b7-62aac3396c78","Type":"ContainerDied","Data":"b3a3a146058c89dd59af81116cd5cc2193337b841ba1b804decdab6758ca6140"} Mar 10 00:19:29 crc kubenswrapper[4994]: I0310 00:19:29.984645 4994 generic.go:334] "Generic (PLEG): container finished" podID="b4b3e4dd-b86b-4442-9067-233a79e7942e" containerID="742fa152f8a71a3355fd939e60aea365e7902312bbccb07ad4c3ff171ed266b2" exitCode=0 Mar 10 00:19:29 crc kubenswrapper[4994]: I0310 00:19:29.984683 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" event={"ID":"b4b3e4dd-b86b-4442-9067-233a79e7942e","Type":"ContainerDied","Data":"742fa152f8a71a3355fd939e60aea365e7902312bbccb07ad4c3ff171ed266b2"} Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.324815 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.335759 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.421316 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-bundle\") pod \"b4b3e4dd-b86b-4442-9067-233a79e7942e\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") " Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.421377 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-util\") pod \"c792896d-13dd-4202-a2b7-62aac3396c78\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") " Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.421404 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twhrf\" (UniqueName: \"kubernetes.io/projected/b4b3e4dd-b86b-4442-9067-233a79e7942e-kube-api-access-twhrf\") pod \"b4b3e4dd-b86b-4442-9067-233a79e7942e\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") " Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.421435 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knbzz\" (UniqueName: \"kubernetes.io/projected/c792896d-13dd-4202-a2b7-62aac3396c78-kube-api-access-knbzz\") pod \"c792896d-13dd-4202-a2b7-62aac3396c78\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") " Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.421507 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-bundle\") pod \"c792896d-13dd-4202-a2b7-62aac3396c78\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") " Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.421545 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-util\") pod \"b4b3e4dd-b86b-4442-9067-233a79e7942e\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") " Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.422294 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-bundle" (OuterVolumeSpecName: "bundle") pod "c792896d-13dd-4202-a2b7-62aac3396c78" (UID: "c792896d-13dd-4202-a2b7-62aac3396c78"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.422602 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-bundle" (OuterVolumeSpecName: "bundle") pod "b4b3e4dd-b86b-4442-9067-233a79e7942e" (UID: "b4b3e4dd-b86b-4442-9067-233a79e7942e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.443077 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c792896d-13dd-4202-a2b7-62aac3396c78-kube-api-access-knbzz" (OuterVolumeSpecName: "kube-api-access-knbzz") pod "c792896d-13dd-4202-a2b7-62aac3396c78" (UID: "c792896d-13dd-4202-a2b7-62aac3396c78"). InnerVolumeSpecName "kube-api-access-knbzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.443166 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4b3e4dd-b86b-4442-9067-233a79e7942e-kube-api-access-twhrf" (OuterVolumeSpecName: "kube-api-access-twhrf") pod "b4b3e4dd-b86b-4442-9067-233a79e7942e" (UID: "b4b3e4dd-b86b-4442-9067-233a79e7942e"). InnerVolumeSpecName "kube-api-access-twhrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.456500 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-util" (OuterVolumeSpecName: "util") pod "b4b3e4dd-b86b-4442-9067-233a79e7942e" (UID: "b4b3e4dd-b86b-4442-9067-233a79e7942e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.522723 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knbzz\" (UniqueName: \"kubernetes.io/projected/c792896d-13dd-4202-a2b7-62aac3396c78-kube-api-access-knbzz\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.522938 4994 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.522995 4994 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-util\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.523079 4994 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.523132 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twhrf\" (UniqueName: \"kubernetes.io/projected/b4b3e4dd-b86b-4442-9067-233a79e7942e-kube-api-access-twhrf\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.644615 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-util" 
(OuterVolumeSpecName: "util") pod "c792896d-13dd-4202-a2b7-62aac3396c78" (UID: "c792896d-13dd-4202-a2b7-62aac3396c78"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.725801 4994 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-util\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.995610 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" event={"ID":"b4b3e4dd-b86b-4442-9067-233a79e7942e","Type":"ContainerDied","Data":"e5fe920b3ed139c55793dea0f6a34bc2501658b48e5c546414dd0c79da5357c8"} Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.995652 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5fe920b3ed139c55793dea0f6a34bc2501658b48e5c546414dd0c79da5357c8" Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.995626 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.997456 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" event={"ID":"c792896d-13dd-4202-a2b7-62aac3396c78","Type":"ContainerDied","Data":"acb9dedd7f509cbae847d1e88acbbfd13d5081e980ec7de42a96bc80f913649f"} Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.997511 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acb9dedd7f509cbae847d1e88acbbfd13d5081e980ec7de42a96bc80f913649f" Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.997518 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.130482 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5"] Mar 10 00:19:34 crc kubenswrapper[4994]: E0310 00:19:34.130764 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c792896d-13dd-4202-a2b7-62aac3396c78" containerName="util" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.130784 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="c792896d-13dd-4202-a2b7-62aac3396c78" containerName="util" Mar 10 00:19:34 crc kubenswrapper[4994]: E0310 00:19:34.130806 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b3e4dd-b86b-4442-9067-233a79e7942e" containerName="util" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.130817 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b3e4dd-b86b-4442-9067-233a79e7942e" containerName="util" Mar 10 00:19:34 crc kubenswrapper[4994]: E0310 00:19:34.130837 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c792896d-13dd-4202-a2b7-62aac3396c78" containerName="extract" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.130849 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="c792896d-13dd-4202-a2b7-62aac3396c78" containerName="extract" Mar 10 00:19:34 crc kubenswrapper[4994]: E0310 00:19:34.130896 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b3e4dd-b86b-4442-9067-233a79e7942e" containerName="extract" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.130909 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b3e4dd-b86b-4442-9067-233a79e7942e" containerName="extract" Mar 10 00:19:34 crc kubenswrapper[4994]: E0310 00:19:34.130927 4994 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c792896d-13dd-4202-a2b7-62aac3396c78" containerName="pull" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.130937 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="c792896d-13dd-4202-a2b7-62aac3396c78" containerName="pull" Mar 10 00:19:34 crc kubenswrapper[4994]: E0310 00:19:34.130949 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b3e4dd-b86b-4442-9067-233a79e7942e" containerName="pull" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.130960 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b3e4dd-b86b-4442-9067-233a79e7942e" containerName="pull" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.131107 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b3e4dd-b86b-4442-9067-233a79e7942e" containerName="extract" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.131132 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="c792896d-13dd-4202-a2b7-62aac3396c78" containerName="extract" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.132250 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.133931 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.145916 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5"] Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.162534 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd5fn\" (UniqueName: \"kubernetes.io/projected/1b75d3a9-a107-4c28-afc2-7eb7e1357113-kube-api-access-bd5fn\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.162600 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.162737 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:34 crc kubenswrapper[4994]: 
I0310 00:19:34.264715 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd5fn\" (UniqueName: \"kubernetes.io/projected/1b75d3a9-a107-4c28-afc2-7eb7e1357113-kube-api-access-bd5fn\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.264846 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.264976 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.265953 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.265961 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.286614 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd5fn\" (UniqueName: \"kubernetes.io/projected/1b75d3a9-a107-4c28-afc2-7eb7e1357113-kube-api-access-bd5fn\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.454409 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.615809 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29"] Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.616763 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.618784 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.619020 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.629033 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-rgfvq" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.634909 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29"] Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.670995 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtvpv\" (UniqueName: \"kubernetes.io/projected/13e52713-fbfe-43ba-ae51-b13a060d8a05-kube-api-access-vtvpv\") pod \"obo-prometheus-operator-68bc856cb9-fnj29\" (UID: \"13e52713-fbfe-43ba-ae51-b13a060d8a05\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.696020 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5"] Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.741959 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw"] Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.742853 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.745918 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.750165 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-qqfzs" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.759418 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r"] Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.760097 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.768435 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw"] Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.771813 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/08b7eb36-ad76-4d9a-9fe9-f37febcdfdab-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw\" (UID: \"08b7eb36-ad76-4d9a-9fe9-f37febcdfdab\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.771880 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/08b7eb36-ad76-4d9a-9fe9-f37febcdfdab-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw\" (UID: 
\"08b7eb36-ad76-4d9a-9fe9-f37febcdfdab\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.771919 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtvpv\" (UniqueName: \"kubernetes.io/projected/13e52713-fbfe-43ba-ae51-b13a060d8a05-kube-api-access-vtvpv\") pod \"obo-prometheus-operator-68bc856cb9-fnj29\" (UID: \"13e52713-fbfe-43ba-ae51-b13a060d8a05\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.775100 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r"] Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.805654 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtvpv\" (UniqueName: \"kubernetes.io/projected/13e52713-fbfe-43ba-ae51-b13a060d8a05-kube-api-access-vtvpv\") pod \"obo-prometheus-operator-68bc856cb9-fnj29\" (UID: \"13e52713-fbfe-43ba-ae51-b13a060d8a05\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.876394 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/22d07ce7-cdcc-4804-8127-a4f3a9d1685f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r\" (UID: \"22d07ce7-cdcc-4804-8127-a4f3a9d1685f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.876695 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/22d07ce7-cdcc-4804-8127-a4f3a9d1685f-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r\" (UID: \"22d07ce7-cdcc-4804-8127-a4f3a9d1685f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.876740 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/08b7eb36-ad76-4d9a-9fe9-f37febcdfdab-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw\" (UID: \"08b7eb36-ad76-4d9a-9fe9-f37febcdfdab\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.876777 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/08b7eb36-ad76-4d9a-9fe9-f37febcdfdab-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw\" (UID: \"08b7eb36-ad76-4d9a-9fe9-f37febcdfdab\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.881315 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/08b7eb36-ad76-4d9a-9fe9-f37febcdfdab-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw\" (UID: \"08b7eb36-ad76-4d9a-9fe9-f37febcdfdab\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.887578 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/08b7eb36-ad76-4d9a-9fe9-f37febcdfdab-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw\" (UID: \"08b7eb36-ad76-4d9a-9fe9-f37febcdfdab\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" Mar 10 
00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.942269 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.943926 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-2jk2w"] Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.944594 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.946904 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-922cp" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.948529 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.958283 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-2jk2w"] Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.981406 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e-observability-operator-tls\") pod \"observability-operator-59bdc8b94-2jk2w\" (UID: \"9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e\") " pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.981461 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t82xb\" (UniqueName: \"kubernetes.io/projected/9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e-kube-api-access-t82xb\") pod \"observability-operator-59bdc8b94-2jk2w\" (UID: \"9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e\") " 
pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.981493 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/22d07ce7-cdcc-4804-8127-a4f3a9d1685f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r\" (UID: \"22d07ce7-cdcc-4804-8127-a4f3a9d1685f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.981511 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/22d07ce7-cdcc-4804-8127-a4f3a9d1685f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r\" (UID: \"22d07ce7-cdcc-4804-8127-a4f3a9d1685f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.984493 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/22d07ce7-cdcc-4804-8127-a4f3a9d1685f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r\" (UID: \"22d07ce7-cdcc-4804-8127-a4f3a9d1685f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.988135 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/22d07ce7-cdcc-4804-8127-a4f3a9d1685f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r\" (UID: \"22d07ce7-cdcc-4804-8127-a4f3a9d1685f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.017840 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" event={"ID":"1b75d3a9-a107-4c28-afc2-7eb7e1357113","Type":"ContainerStarted","Data":"a88bdcb4e90901bbacfd8ee24aa0a9a374eebc2ed6f9998f79f9db1119b33e20"} Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.017890 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" event={"ID":"1b75d3a9-a107-4c28-afc2-7eb7e1357113","Type":"ContainerStarted","Data":"d73cec1d6e71b20e6e921ec34546a853007d278d08029837454b216700528bdf"} Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.082468 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e-observability-operator-tls\") pod \"observability-operator-59bdc8b94-2jk2w\" (UID: \"9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e\") " pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.082532 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t82xb\" (UniqueName: \"kubernetes.io/projected/9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e-kube-api-access-t82xb\") pod \"observability-operator-59bdc8b94-2jk2w\" (UID: \"9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e\") " pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.086226 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e-observability-operator-tls\") pod \"observability-operator-59bdc8b94-2jk2w\" (UID: \"9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e\") " pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.086301 4994 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-gxbhj"] Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.086962 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.098116 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-gxbhj"] Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.105623 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t82xb\" (UniqueName: \"kubernetes.io/projected/9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e-kube-api-access-t82xb\") pod \"observability-operator-59bdc8b94-2jk2w\" (UID: \"9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e\") " pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.107045 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.109183 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-6wpx6" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.133192 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.183533 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/65c6820f-4375-4de8-bcdf-0f0e2c4bcd87-openshift-service-ca\") pod \"perses-operator-5bf474d74f-gxbhj\" (UID: \"65c6820f-4375-4de8-bcdf-0f0e2c4bcd87\") " pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.183842 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cvd8\" (UniqueName: \"kubernetes.io/projected/65c6820f-4375-4de8-bcdf-0f0e2c4bcd87-kube-api-access-6cvd8\") pod \"perses-operator-5bf474d74f-gxbhj\" (UID: \"65c6820f-4375-4de8-bcdf-0f0e2c4bcd87\") " pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.257218 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.285244 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cvd8\" (UniqueName: \"kubernetes.io/projected/65c6820f-4375-4de8-bcdf-0f0e2c4bcd87-kube-api-access-6cvd8\") pod \"perses-operator-5bf474d74f-gxbhj\" (UID: \"65c6820f-4375-4de8-bcdf-0f0e2c4bcd87\") " pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.285358 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/65c6820f-4375-4de8-bcdf-0f0e2c4bcd87-openshift-service-ca\") pod \"perses-operator-5bf474d74f-gxbhj\" (UID: \"65c6820f-4375-4de8-bcdf-0f0e2c4bcd87\") " pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.287028 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/65c6820f-4375-4de8-bcdf-0f0e2c4bcd87-openshift-service-ca\") pod \"perses-operator-5bf474d74f-gxbhj\" (UID: \"65c6820f-4375-4de8-bcdf-0f0e2c4bcd87\") " pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.306056 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cvd8\" (UniqueName: \"kubernetes.io/projected/65c6820f-4375-4de8-bcdf-0f0e2c4bcd87-kube-api-access-6cvd8\") pod \"perses-operator-5bf474d74f-gxbhj\" (UID: \"65c6820f-4375-4de8-bcdf-0f0e2c4bcd87\") " pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.366443 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r"] Mar 10 00:19:35 crc kubenswrapper[4994]: 
W0310 00:19:35.383193 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22d07ce7_cdcc_4804_8127_a4f3a9d1685f.slice/crio-2816254c96065bbacbf1f79a20d126caa641e7b7d6c396eecc2b6928258b8f99 WatchSource:0}: Error finding container 2816254c96065bbacbf1f79a20d126caa641e7b7d6c396eecc2b6928258b8f99: Status 404 returned error can't find the container with id 2816254c96065bbacbf1f79a20d126caa641e7b7d6c396eecc2b6928258b8f99 Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.408528 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29"] Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.421333 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw"] Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.432459 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.531596 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-2jk2w"] Mar 10 00:19:35 crc kubenswrapper[4994]: W0310 00:19:35.538156 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ec2ef1a_309f_4d22_b9e7_c6536fb8a46e.slice/crio-ab41d7e04137fd8dc782ceb6a4de4dd97f27ec01ea6be5948504a7d82595a003 WatchSource:0}: Error finding container ab41d7e04137fd8dc782ceb6a4de4dd97f27ec01ea6be5948504a7d82595a003: Status 404 returned error can't find the container with id ab41d7e04137fd8dc782ceb6a4de4dd97f27ec01ea6be5948504a7d82595a003 Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.622558 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-gxbhj"] Mar 10 
00:19:35 crc kubenswrapper[4994]: W0310 00:19:35.629251 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65c6820f_4375_4de8_bcdf_0f0e2c4bcd87.slice/crio-3d6413977120c2da73c7d14d5d9dd5f31fb6a1df482e0cb53b4ed4214d80f951 WatchSource:0}: Error finding container 3d6413977120c2da73c7d14d5d9dd5f31fb6a1df482e0cb53b4ed4214d80f951: Status 404 returned error can't find the container with id 3d6413977120c2da73c7d14d5d9dd5f31fb6a1df482e0cb53b4ed4214d80f951 Mar 10 00:19:36 crc kubenswrapper[4994]: I0310 00:19:36.026027 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" event={"ID":"9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e","Type":"ContainerStarted","Data":"ab41d7e04137fd8dc782ceb6a4de4dd97f27ec01ea6be5948504a7d82595a003"} Mar 10 00:19:36 crc kubenswrapper[4994]: I0310 00:19:36.027767 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29" event={"ID":"13e52713-fbfe-43ba-ae51-b13a060d8a05","Type":"ContainerStarted","Data":"988bbc205a1350baa077393b997ca568bde16cd9b7c7a40885c2a283eb21d3c4"} Mar 10 00:19:36 crc kubenswrapper[4994]: I0310 00:19:36.029455 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" event={"ID":"08b7eb36-ad76-4d9a-9fe9-f37febcdfdab","Type":"ContainerStarted","Data":"d1577796553acf114f7080bb7c9e207c3708bc27c9e6f9f5a3983158ee3a7833"} Mar 10 00:19:36 crc kubenswrapper[4994]: I0310 00:19:36.030585 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" event={"ID":"22d07ce7-cdcc-4804-8127-a4f3a9d1685f","Type":"ContainerStarted","Data":"2816254c96065bbacbf1f79a20d126caa641e7b7d6c396eecc2b6928258b8f99"} Mar 10 00:19:36 crc kubenswrapper[4994]: I0310 00:19:36.031853 4994 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" event={"ID":"65c6820f-4375-4de8-bcdf-0f0e2c4bcd87","Type":"ContainerStarted","Data":"3d6413977120c2da73c7d14d5d9dd5f31fb6a1df482e0cb53b4ed4214d80f951"} Mar 10 00:19:36 crc kubenswrapper[4994]: I0310 00:19:36.033495 4994 generic.go:334] "Generic (PLEG): container finished" podID="1b75d3a9-a107-4c28-afc2-7eb7e1357113" containerID="a88bdcb4e90901bbacfd8ee24aa0a9a374eebc2ed6f9998f79f9db1119b33e20" exitCode=0 Mar 10 00:19:36 crc kubenswrapper[4994]: I0310 00:19:36.033539 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" event={"ID":"1b75d3a9-a107-4c28-afc2-7eb7e1357113","Type":"ContainerDied","Data":"a88bdcb4e90901bbacfd8ee24aa0a9a374eebc2ed6f9998f79f9db1119b33e20"} Mar 10 00:19:41 crc kubenswrapper[4994]: I0310 00:19:41.968839 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-m4pqx"] Mar 10 00:19:41 crc kubenswrapper[4994]: I0310 00:19:41.970152 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-m4pqx" Mar 10 00:19:41 crc kubenswrapper[4994]: I0310 00:19:41.972608 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-dppkh" Mar 10 00:19:41 crc kubenswrapper[4994]: I0310 00:19:41.972718 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Mar 10 00:19:41 crc kubenswrapper[4994]: I0310 00:19:41.980232 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Mar 10 00:19:41 crc kubenswrapper[4994]: I0310 00:19:41.986307 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-m4pqx"] Mar 10 00:19:42 crc kubenswrapper[4994]: I0310 00:19:42.069582 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb44d\" (UniqueName: \"kubernetes.io/projected/e54a8fd6-9ed8-42fd-bf63-b45930f9c54e-kube-api-access-fb44d\") pod \"interconnect-operator-5bb49f789d-m4pqx\" (UID: \"e54a8fd6-9ed8-42fd-bf63-b45930f9c54e\") " pod="service-telemetry/interconnect-operator-5bb49f789d-m4pqx" Mar 10 00:19:42 crc kubenswrapper[4994]: I0310 00:19:42.170774 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb44d\" (UniqueName: \"kubernetes.io/projected/e54a8fd6-9ed8-42fd-bf63-b45930f9c54e-kube-api-access-fb44d\") pod \"interconnect-operator-5bb49f789d-m4pqx\" (UID: \"e54a8fd6-9ed8-42fd-bf63-b45930f9c54e\") " pod="service-telemetry/interconnect-operator-5bb49f789d-m4pqx" Mar 10 00:19:42 crc kubenswrapper[4994]: I0310 00:19:42.194222 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb44d\" (UniqueName: \"kubernetes.io/projected/e54a8fd6-9ed8-42fd-bf63-b45930f9c54e-kube-api-access-fb44d\") pod 
\"interconnect-operator-5bb49f789d-m4pqx\" (UID: \"e54a8fd6-9ed8-42fd-bf63-b45930f9c54e\") " pod="service-telemetry/interconnect-operator-5bb49f789d-m4pqx" Mar 10 00:19:42 crc kubenswrapper[4994]: I0310 00:19:42.287672 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-m4pqx" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.062902 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-67966b6766-wzg6h"] Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.064473 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.066338 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.066805 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-4h5bj" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.122767 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-67966b6766-wzg6h"] Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.210780 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e02a24cb-8f25-4fd1-9f32-aa8ff3116662-webhook-cert\") pod \"elastic-operator-67966b6766-wzg6h\" (UID: \"e02a24cb-8f25-4fd1-9f32-aa8ff3116662\") " pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.210827 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e02a24cb-8f25-4fd1-9f32-aa8ff3116662-apiservice-cert\") pod 
\"elastic-operator-67966b6766-wzg6h\" (UID: \"e02a24cb-8f25-4fd1-9f32-aa8ff3116662\") " pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.210846 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzntx\" (UniqueName: \"kubernetes.io/projected/e02a24cb-8f25-4fd1-9f32-aa8ff3116662-kube-api-access-xzntx\") pod \"elastic-operator-67966b6766-wzg6h\" (UID: \"e02a24cb-8f25-4fd1-9f32-aa8ff3116662\") " pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.312560 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e02a24cb-8f25-4fd1-9f32-aa8ff3116662-webhook-cert\") pod \"elastic-operator-67966b6766-wzg6h\" (UID: \"e02a24cb-8f25-4fd1-9f32-aa8ff3116662\") " pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.312618 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e02a24cb-8f25-4fd1-9f32-aa8ff3116662-apiservice-cert\") pod \"elastic-operator-67966b6766-wzg6h\" (UID: \"e02a24cb-8f25-4fd1-9f32-aa8ff3116662\") " pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.312643 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzntx\" (UniqueName: \"kubernetes.io/projected/e02a24cb-8f25-4fd1-9f32-aa8ff3116662-kube-api-access-xzntx\") pod \"elastic-operator-67966b6766-wzg6h\" (UID: \"e02a24cb-8f25-4fd1-9f32-aa8ff3116662\") " pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.330791 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/e02a24cb-8f25-4fd1-9f32-aa8ff3116662-webhook-cert\") pod \"elastic-operator-67966b6766-wzg6h\" (UID: \"e02a24cb-8f25-4fd1-9f32-aa8ff3116662\") " pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.331516 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzntx\" (UniqueName: \"kubernetes.io/projected/e02a24cb-8f25-4fd1-9f32-aa8ff3116662-kube-api-access-xzntx\") pod \"elastic-operator-67966b6766-wzg6h\" (UID: \"e02a24cb-8f25-4fd1-9f32-aa8ff3116662\") " pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.333485 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e02a24cb-8f25-4fd1-9f32-aa8ff3116662-apiservice-cert\") pod \"elastic-operator-67966b6766-wzg6h\" (UID: \"e02a24cb-8f25-4fd1-9f32-aa8ff3116662\") " pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.378682 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:50 crc kubenswrapper[4994]: E0310 00:19:50.109711 4994 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:b5c8526d2ae660fe092dd8a7acf18ec4957d5c265890a222f55396fc2cdaeed8" Mar 10 00:19:50 crc kubenswrapper[4994]: E0310 00:19:50.110423 4994 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:perses-operator,Image:registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:b5c8526d2ae660fe092dd8a7acf18ec4957d5c265890a222f55396fc2cdaeed8,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openshift-service-ca,ReadOnly:true,MountPath:/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6cvd8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod perses-operator-5bf474d74f-gxbhj_openshift-operators(65c6820f-4375-4de8-bcdf-0f0e2c4bcd87): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 00:19:50 crc kubenswrapper[4994]: E0310 00:19:50.111935 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" podUID="65c6820f-4375-4de8-bcdf-0f0e2c4bcd87" Mar 10 00:19:50 crc kubenswrapper[4994]: E0310 00:19:50.152976 4994 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea" Mar 10 00:19:50 crc kubenswrapper[4994]: E0310 00:19:50.153145 4994 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r_openshift-operators(22d07ce7-cdcc-4804-8127-a4f3a9d1685f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 00:19:50 crc kubenswrapper[4994]: E0310 00:19:50.154329 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" podUID="22d07ce7-cdcc-4804-8127-a4f3a9d1685f" Mar 10 00:19:50 crc kubenswrapper[4994]: I0310 00:19:50.359691 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-m4pqx"] Mar 10 00:19:50 crc kubenswrapper[4994]: I0310 00:19:50.405672 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-67966b6766-wzg6h"] Mar 10 00:19:50 crc kubenswrapper[4994]: W0310 00:19:50.410006 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode02a24cb_8f25_4fd1_9f32_aa8ff3116662.slice/crio-5a5f406586dda641c509124dadd094637b3c5dba85bd13d1dd4468f6f0a6b4f8 WatchSource:0}: Error finding container 5a5f406586dda641c509124dadd094637b3c5dba85bd13d1dd4468f6f0a6b4f8: Status 404 returned error can't find the container with id 5a5f406586dda641c509124dadd094637b3c5dba85bd13d1dd4468f6f0a6b4f8 Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.125596 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-m4pqx" event={"ID":"e54a8fd6-9ed8-42fd-bf63-b45930f9c54e","Type":"ContainerStarted","Data":"3ad6c46f080b07e26711b4b60b1748b1ae30a55efd962a95a36c10006db6f04a"} Mar 10 00:19:51 crc kubenswrapper[4994]: 
I0310 00:19:51.128371 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-67966b6766-wzg6h" event={"ID":"e02a24cb-8f25-4fd1-9f32-aa8ff3116662","Type":"ContainerStarted","Data":"5a5f406586dda641c509124dadd094637b3c5dba85bd13d1dd4468f6f0a6b4f8"} Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.129818 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" event={"ID":"08b7eb36-ad76-4d9a-9fe9-f37febcdfdab","Type":"ContainerStarted","Data":"26d19c6cb24f1a9a364cca37baf5c7e6bb8b73e0457d6853ca6f0fb944b29bdf"} Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.131318 4994 generic.go:334] "Generic (PLEG): container finished" podID="1b75d3a9-a107-4c28-afc2-7eb7e1357113" containerID="2fab9f3f289f9b453d5428a5dfb3bb7f1306fe48c43ade59ef20e4832dc1272b" exitCode=0 Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.131375 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" event={"ID":"1b75d3a9-a107-4c28-afc2-7eb7e1357113","Type":"ContainerDied","Data":"2fab9f3f289f9b453d5428a5dfb3bb7f1306fe48c43ade59ef20e4832dc1272b"} Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.132859 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" event={"ID":"9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e","Type":"ContainerStarted","Data":"e339bb8b7cd9ef341b3fb2ddd08fb98ba5e3ce775ddf16513422bc6dc29ec284"} Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.133101 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.136759 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29" 
event={"ID":"13e52713-fbfe-43ba-ae51-b13a060d8a05","Type":"ContainerStarted","Data":"253c3e6dc952519cd8f613f2db84f9d830ae1d0b6975d697ace39024e527f56e"} Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.153179 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" podStartSLOduration=2.42793204 podStartE2EDuration="17.153157882s" podCreationTimestamp="2026-03-10 00:19:34 +0000 UTC" firstStartedPulling="2026-03-10 00:19:35.446258806 +0000 UTC m=+789.619965555" lastFinishedPulling="2026-03-10 00:19:50.171484638 +0000 UTC m=+804.345191397" observedRunningTime="2026-03-10 00:19:51.145702661 +0000 UTC m=+805.319409420" watchObservedRunningTime="2026-03-10 00:19:51.153157882 +0000 UTC m=+805.326864631" Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.156289 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" Mar 10 00:19:51 crc kubenswrapper[4994]: E0310 00:19:51.157864 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:b5c8526d2ae660fe092dd8a7acf18ec4957d5c265890a222f55396fc2cdaeed8\\\"\"" pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" podUID="65c6820f-4375-4de8-bcdf-0f0e2c4bcd87" Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.200542 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" podStartSLOduration=2.5606218050000003 podStartE2EDuration="17.200529244s" podCreationTimestamp="2026-03-10 00:19:34 +0000 UTC" firstStartedPulling="2026-03-10 00:19:35.539995125 +0000 UTC m=+789.713701874" lastFinishedPulling="2026-03-10 00:19:50.179902564 +0000 UTC m=+804.353609313" 
observedRunningTime="2026-03-10 00:19:51.198516081 +0000 UTC m=+805.372222860" watchObservedRunningTime="2026-03-10 00:19:51.200529244 +0000 UTC m=+805.374235993" Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.272586 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29" podStartSLOduration=2.515916323 podStartE2EDuration="17.27256674s" podCreationTimestamp="2026-03-10 00:19:34 +0000 UTC" firstStartedPulling="2026-03-10 00:19:35.412856018 +0000 UTC m=+789.586562767" lastFinishedPulling="2026-03-10 00:19:50.169506425 +0000 UTC m=+804.343213184" observedRunningTime="2026-03-10 00:19:51.269982441 +0000 UTC m=+805.443689190" watchObservedRunningTime="2026-03-10 00:19:51.27256674 +0000 UTC m=+805.446273489" Mar 10 00:19:52 crc kubenswrapper[4994]: I0310 00:19:52.148771 4994 generic.go:334] "Generic (PLEG): container finished" podID="1b75d3a9-a107-4c28-afc2-7eb7e1357113" containerID="360b49d34a22c59827c7cea4c2fc6406ccd0c8bf54363f66c38e3ae283b8a608" exitCode=0 Mar 10 00:19:52 crc kubenswrapper[4994]: I0310 00:19:52.148824 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" event={"ID":"1b75d3a9-a107-4c28-afc2-7eb7e1357113","Type":"ContainerDied","Data":"360b49d34a22c59827c7cea4c2fc6406ccd0c8bf54363f66c38e3ae283b8a608"} Mar 10 00:19:52 crc kubenswrapper[4994]: I0310 00:19:52.153321 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" event={"ID":"22d07ce7-cdcc-4804-8127-a4f3a9d1685f","Type":"ContainerStarted","Data":"30f1685ca52b28c6bfccb579464d4018c94422b4f82ed664aaadc8cd2d6056a4"} Mar 10 00:19:52 crc kubenswrapper[4994]: I0310 00:19:52.201006 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" 
podStartSLOduration=-9223372018.653791 podStartE2EDuration="18.200984863s" podCreationTimestamp="2026-03-10 00:19:34 +0000 UTC" firstStartedPulling="2026-03-10 00:19:35.390396094 +0000 UTC m=+789.564102843" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:19:52.196915293 +0000 UTC m=+806.370622032" watchObservedRunningTime="2026-03-10 00:19:52.200984863 +0000 UTC m=+806.374691612" Mar 10 00:19:54 crc kubenswrapper[4994]: I0310 00:19:54.640221 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:54 crc kubenswrapper[4994]: I0310 00:19:54.746265 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd5fn\" (UniqueName: \"kubernetes.io/projected/1b75d3a9-a107-4c28-afc2-7eb7e1357113-kube-api-access-bd5fn\") pod \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " Mar 10 00:19:54 crc kubenswrapper[4994]: I0310 00:19:54.746324 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-util\") pod \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " Mar 10 00:19:54 crc kubenswrapper[4994]: I0310 00:19:54.746384 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-bundle\") pod \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " Mar 10 00:19:54 crc kubenswrapper[4994]: I0310 00:19:54.747376 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-bundle" (OuterVolumeSpecName: "bundle") pod "1b75d3a9-a107-4c28-afc2-7eb7e1357113" (UID: 
"1b75d3a9-a107-4c28-afc2-7eb7e1357113"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:54 crc kubenswrapper[4994]: I0310 00:19:54.754076 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b75d3a9-a107-4c28-afc2-7eb7e1357113-kube-api-access-bd5fn" (OuterVolumeSpecName: "kube-api-access-bd5fn") pod "1b75d3a9-a107-4c28-afc2-7eb7e1357113" (UID: "1b75d3a9-a107-4c28-afc2-7eb7e1357113"). InnerVolumeSpecName "kube-api-access-bd5fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:19:54 crc kubenswrapper[4994]: I0310 00:19:54.757048 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-util" (OuterVolumeSpecName: "util") pod "1b75d3a9-a107-4c28-afc2-7eb7e1357113" (UID: "1b75d3a9-a107-4c28-afc2-7eb7e1357113"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:54 crc kubenswrapper[4994]: I0310 00:19:54.848227 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd5fn\" (UniqueName: \"kubernetes.io/projected/1b75d3a9-a107-4c28-afc2-7eb7e1357113-kube-api-access-bd5fn\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:54 crc kubenswrapper[4994]: I0310 00:19:54.848491 4994 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-util\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:54 crc kubenswrapper[4994]: I0310 00:19:54.848501 4994 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.188387 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" 
event={"ID":"1b75d3a9-a107-4c28-afc2-7eb7e1357113","Type":"ContainerDied","Data":"d73cec1d6e71b20e6e921ec34546a853007d278d08029837454b216700528bdf"} Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.188462 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d73cec1d6e71b20e6e921ec34546a853007d278d08029837454b216700528bdf" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.188558 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.190505 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-67966b6766-wzg6h" event={"ID":"e02a24cb-8f25-4fd1-9f32-aa8ff3116662","Type":"ContainerStarted","Data":"e8d5c3c3a60b753e86a20b2e01f2d523b1afc35c5ea6965bcf7699e80ba82b79"} Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.211835 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-67966b6766-wzg6h" podStartSLOduration=5.976363419 podStartE2EDuration="10.211816791s" podCreationTimestamp="2026-03-10 00:19:45 +0000 UTC" firstStartedPulling="2026-03-10 00:19:50.412602978 +0000 UTC m=+804.586309727" lastFinishedPulling="2026-03-10 00:19:54.64805635 +0000 UTC m=+808.821763099" observedRunningTime="2026-03-10 00:19:55.208476591 +0000 UTC m=+809.382183350" watchObservedRunningTime="2026-03-10 00:19:55.211816791 +0000 UTC m=+809.385523540" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.798750 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 10 00:19:55 crc kubenswrapper[4994]: E0310 00:19:55.799633 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b75d3a9-a107-4c28-afc2-7eb7e1357113" containerName="extract" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.799731 4994 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1b75d3a9-a107-4c28-afc2-7eb7e1357113" containerName="extract" Mar 10 00:19:55 crc kubenswrapper[4994]: E0310 00:19:55.799806 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b75d3a9-a107-4c28-afc2-7eb7e1357113" containerName="pull" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.799856 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b75d3a9-a107-4c28-afc2-7eb7e1357113" containerName="pull" Mar 10 00:19:55 crc kubenswrapper[4994]: E0310 00:19:55.799927 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b75d3a9-a107-4c28-afc2-7eb7e1357113" containerName="util" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.799976 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b75d3a9-a107-4c28-afc2-7eb7e1357113" containerName="util" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.800133 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b75d3a9-a107-4c28-afc2-7eb7e1357113" containerName="extract" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.800928 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.803583 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.803853 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.803961 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.804286 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-cw6vm" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.804330 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.804356 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.804442 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.804497 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.804531 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.858696 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: 
\"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.858961 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859046 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859205 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859247 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " 
pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859273 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859300 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859326 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859352 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859393 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" 
(UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859416 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859438 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859457 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859487 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " 
pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859513 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.923879 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.960413 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.960693 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.960788 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.960895 4994 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961002 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961134 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961229 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961302 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: 
\"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961365 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961435 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961504 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961569 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 
00:19:55.961617 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961705 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961810 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961921 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.962397 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" 
Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.964320 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.964636 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.964902 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.966240 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961841 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: 
\"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.966471 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.966471 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.970376 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.971037 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.971670 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.972250 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.974282 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.978611 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:56 crc kubenswrapper[4994]: I0310 00:19:56.116711 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:59 crc kubenswrapper[4994]: I0310 00:19:59.667260 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 10 00:19:59 crc kubenswrapper[4994]: W0310 00:19:59.669284 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e5e8d5b_ba04_461d_b7d0_98d90dd79fd7.slice/crio-296aa5ba18bedbb49c263d81e2e1853355ecf06134ea220bee7480951ed28306 WatchSource:0}: Error finding container 296aa5ba18bedbb49c263d81e2e1853355ecf06134ea220bee7480951ed28306: Status 404 returned error can't find the container with id 296aa5ba18bedbb49c263d81e2e1853355ecf06134ea220bee7480951ed28306 Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.130965 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551700-9pnx5"] Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.131897 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551700-9pnx5" Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.134584 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.134742 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.135110 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.139653 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551700-9pnx5"] Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.214445 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkmwt\" (UniqueName: \"kubernetes.io/projected/e07bbe9c-f27a-4256-82ba-3adc771e2ebd-kube-api-access-gkmwt\") pod \"auto-csr-approver-29551700-9pnx5\" (UID: \"e07bbe9c-f27a-4256-82ba-3adc771e2ebd\") " pod="openshift-infra/auto-csr-approver-29551700-9pnx5" Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.224117 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-m4pqx" event={"ID":"e54a8fd6-9ed8-42fd-bf63-b45930f9c54e","Type":"ContainerStarted","Data":"3c131e6c09642c6c06c93905255f6d907a55064ea977ae14f13372801517a5e4"} Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.225132 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7","Type":"ContainerStarted","Data":"296aa5ba18bedbb49c263d81e2e1853355ecf06134ea220bee7480951ed28306"} Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.240131 4994 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="service-telemetry/interconnect-operator-5bb49f789d-m4pqx" podStartSLOduration=10.065170466 podStartE2EDuration="19.2401095s" podCreationTimestamp="2026-03-10 00:19:41 +0000 UTC" firstStartedPulling="2026-03-10 00:19:50.377990388 +0000 UTC m=+804.551697137" lastFinishedPulling="2026-03-10 00:19:59.552929412 +0000 UTC m=+813.726636171" observedRunningTime="2026-03-10 00:20:00.236199156 +0000 UTC m=+814.409905905" watchObservedRunningTime="2026-03-10 00:20:00.2401095 +0000 UTC m=+814.413816249" Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.316198 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkmwt\" (UniqueName: \"kubernetes.io/projected/e07bbe9c-f27a-4256-82ba-3adc771e2ebd-kube-api-access-gkmwt\") pod \"auto-csr-approver-29551700-9pnx5\" (UID: \"e07bbe9c-f27a-4256-82ba-3adc771e2ebd\") " pod="openshift-infra/auto-csr-approver-29551700-9pnx5" Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.343311 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkmwt\" (UniqueName: \"kubernetes.io/projected/e07bbe9c-f27a-4256-82ba-3adc771e2ebd-kube-api-access-gkmwt\") pod \"auto-csr-approver-29551700-9pnx5\" (UID: \"e07bbe9c-f27a-4256-82ba-3adc771e2ebd\") " pod="openshift-infra/auto-csr-approver-29551700-9pnx5" Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.457535 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551700-9pnx5" Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.707599 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551700-9pnx5"] Mar 10 00:20:01 crc kubenswrapper[4994]: I0310 00:20:01.234098 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551700-9pnx5" event={"ID":"e07bbe9c-f27a-4256-82ba-3adc771e2ebd","Type":"ContainerStarted","Data":"010910f36065ef6295367084341785cffbef39a2a3b7c8e3d664f1822e8f079a"} Mar 10 00:20:05 crc kubenswrapper[4994]: I0310 00:20:05.264058 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" event={"ID":"65c6820f-4375-4de8-bcdf-0f0e2c4bcd87","Type":"ContainerStarted","Data":"93abdcc9799949b43a61376d54507d7a8adbb2ce25ff7af94f044327956627a5"} Mar 10 00:20:05 crc kubenswrapper[4994]: I0310 00:20:05.264582 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" Mar 10 00:20:05 crc kubenswrapper[4994]: I0310 00:20:05.265631 4994 generic.go:334] "Generic (PLEG): container finished" podID="e07bbe9c-f27a-4256-82ba-3adc771e2ebd" containerID="7f305cfa821f31b484905a2d361cdfc46f777a5744001baccc6a559f30eb2409" exitCode=0 Mar 10 00:20:05 crc kubenswrapper[4994]: I0310 00:20:05.265675 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551700-9pnx5" event={"ID":"e07bbe9c-f27a-4256-82ba-3adc771e2ebd","Type":"ContainerDied","Data":"7f305cfa821f31b484905a2d361cdfc46f777a5744001baccc6a559f30eb2409"} Mar 10 00:20:05 crc kubenswrapper[4994]: I0310 00:20:05.288239 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" podStartSLOduration=1.385109356 podStartE2EDuration="30.288222222s" podCreationTimestamp="2026-03-10 00:19:35 +0000 UTC" 
firstStartedPulling="2026-03-10 00:19:35.632182612 +0000 UTC m=+789.805889361" lastFinishedPulling="2026-03-10 00:20:04.535295478 +0000 UTC m=+818.709002227" observedRunningTime="2026-03-10 00:20:05.28402493 +0000 UTC m=+819.457731699" watchObservedRunningTime="2026-03-10 00:20:05.288222222 +0000 UTC m=+819.461928981" Mar 10 00:20:06 crc kubenswrapper[4994]: I0310 00:20:06.651816 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551700-9pnx5" Mar 10 00:20:06 crc kubenswrapper[4994]: I0310 00:20:06.709252 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkmwt\" (UniqueName: \"kubernetes.io/projected/e07bbe9c-f27a-4256-82ba-3adc771e2ebd-kube-api-access-gkmwt\") pod \"e07bbe9c-f27a-4256-82ba-3adc771e2ebd\" (UID: \"e07bbe9c-f27a-4256-82ba-3adc771e2ebd\") " Mar 10 00:20:06 crc kubenswrapper[4994]: I0310 00:20:06.724096 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e07bbe9c-f27a-4256-82ba-3adc771e2ebd-kube-api-access-gkmwt" (OuterVolumeSpecName: "kube-api-access-gkmwt") pod "e07bbe9c-f27a-4256-82ba-3adc771e2ebd" (UID: "e07bbe9c-f27a-4256-82ba-3adc771e2ebd"). InnerVolumeSpecName "kube-api-access-gkmwt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:20:06 crc kubenswrapper[4994]: I0310 00:20:06.811491 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkmwt\" (UniqueName: \"kubernetes.io/projected/e07bbe9c-f27a-4256-82ba-3adc771e2ebd-kube-api-access-gkmwt\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:07 crc kubenswrapper[4994]: I0310 00:20:07.278793 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551700-9pnx5" event={"ID":"e07bbe9c-f27a-4256-82ba-3adc771e2ebd","Type":"ContainerDied","Data":"010910f36065ef6295367084341785cffbef39a2a3b7c8e3d664f1822e8f079a"} Mar 10 00:20:07 crc kubenswrapper[4994]: I0310 00:20:07.278831 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="010910f36065ef6295367084341785cffbef39a2a3b7c8e3d664f1822e8f079a" Mar 10 00:20:07 crc kubenswrapper[4994]: I0310 00:20:07.278852 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551700-9pnx5" Mar 10 00:20:07 crc kubenswrapper[4994]: I0310 00:20:07.720199 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551694-pngfv"] Mar 10 00:20:07 crc kubenswrapper[4994]: I0310 00:20:07.724259 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551694-pngfv"] Mar 10 00:20:08 crc kubenswrapper[4994]: I0310 00:20:08.564568 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e91ae1c5-3f03-4439-b579-b828884a1b58" path="/var/lib/kubelet/pods/e91ae1c5-3f03-4439-b579-b828884a1b58/volumes" Mar 10 00:20:09 crc kubenswrapper[4994]: I0310 00:20:09.837912 4994 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 00:20:11 crc kubenswrapper[4994]: I0310 00:20:11.836048 4994 scope.go:117] "RemoveContainer" 
containerID="e7fca198849ed5918e32d90439680c268c9d2e25eff70cc1e2ce92503bc67d85" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.722932 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx"] Mar 10 00:20:12 crc kubenswrapper[4994]: E0310 00:20:12.723536 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e07bbe9c-f27a-4256-82ba-3adc771e2ebd" containerName="oc" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.723552 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="e07bbe9c-f27a-4256-82ba-3adc771e2ebd" containerName="oc" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.723701 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="e07bbe9c-f27a-4256-82ba-3adc771e2ebd" containerName="oc" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.724231 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.725752 4994 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-5rlc4" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.726358 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.726527 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.750121 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx"] Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.802754 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-tcl77\" (UniqueName: \"kubernetes.io/projected/47cdbf6c-4d30-40c5-8af9-6685cd711b7a-kube-api-access-tcl77\") pod \"cert-manager-operator-controller-manager-5586865c96-jzlcx\" (UID: \"47cdbf6c-4d30-40c5-8af9-6685cd711b7a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.802983 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/47cdbf6c-4d30-40c5-8af9-6685cd711b7a-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-jzlcx\" (UID: \"47cdbf6c-4d30-40c5-8af9-6685cd711b7a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.904173 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcl77\" (UniqueName: \"kubernetes.io/projected/47cdbf6c-4d30-40c5-8af9-6685cd711b7a-kube-api-access-tcl77\") pod \"cert-manager-operator-controller-manager-5586865c96-jzlcx\" (UID: \"47cdbf6c-4d30-40c5-8af9-6685cd711b7a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.904277 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/47cdbf6c-4d30-40c5-8af9-6685cd711b7a-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-jzlcx\" (UID: \"47cdbf6c-4d30-40c5-8af9-6685cd711b7a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.904729 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/47cdbf6c-4d30-40c5-8af9-6685cd711b7a-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-jzlcx\" 
(UID: \"47cdbf6c-4d30-40c5-8af9-6685cd711b7a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.927977 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcl77\" (UniqueName: \"kubernetes.io/projected/47cdbf6c-4d30-40c5-8af9-6685cd711b7a-kube-api-access-tcl77\") pod \"cert-manager-operator-controller-manager-5586865c96-jzlcx\" (UID: \"47cdbf6c-4d30-40c5-8af9-6685cd711b7a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" Mar 10 00:20:13 crc kubenswrapper[4994]: I0310 00:20:13.049083 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" Mar 10 00:20:15 crc kubenswrapper[4994]: I0310 00:20:15.436071 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" Mar 10 00:20:20 crc kubenswrapper[4994]: I0310 00:20:20.195593 4994 scope.go:117] "RemoveContainer" containerID="ddb1ff554509065a0634194f412b0e90319b501a3735bf3cda900f518d12f147" Mar 10 00:20:20 crc kubenswrapper[4994]: I0310 00:20:20.762798 4994 scope.go:117] "RemoveContainer" containerID="ef1f80910f9e65f34790675bdb343fd6ffef0cbe9f22353df047c95ba63843ec" Mar 10 00:20:21 crc kubenswrapper[4994]: E0310 00:20:21.377691 4994 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="registry.connect.redhat.com/elastic/elasticsearch:7.17.20" Mar 10 00:20:21 crc kubenswrapper[4994]: E0310 00:20:21.378285 4994 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:elastic-internal-init-filesystem,Image:registry.connect.redhat.com/elastic/elasticsearch:7.17.20,Command:[bash -c 
/mnt/elastic-internal/scripts/prepare-fs.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:HEADLESS_SERVICE_NAME,Value:elasticsearch-es-default,ValueFrom:nil,},EnvVar{Name:PROBE_PASSWORD_PATH,Value:/mnt/elastic-internal/pod-mounted-users/elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:PROBE_USERNAME,Value:elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:READINESS_PROBE_PROTOCOL,Value:https,ValueFrom:nil,},EnvVar{Name:NSS_SDB_USE_CACHE,Value:no,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:downward-api,ReadOnly:true,MountPath:/mnt/elastic-internal/downward-api,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-bin-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-bin-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config,ReadOnly:true,MountPath:/mnt/elastic-internal/elasticsearch-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-config-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-plugins-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-plugins-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-http-certificates,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/http-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-probe-user,ReadOnly:true,MountPath:/mnt/elastic-internal/pod-mounted-users,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-remote-certificate-authorities,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/transport-remote-certs/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-scripts,ReadOnly:true,MountPath:/mnt/elastic-internal/scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-transport-certificates,ReadOnly:true,MountPath:/mnt/elastic-internal/transport-certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-unicast-hosts,ReadOnly:
true,MountPath:/mnt/elastic-internal/unicast-hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-xpack-file-realm,ReadOnly:true,MountPath:/mnt/elastic-internal/xpack-file-realm,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-data,ReadOnly:false,MountPath:/usr/share/elasticsearch/data,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-logs,ReadOnly:false,MountPath:/usr/share/elasticsearch/logs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tmp-volume,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod elasticsearch-es-default-0_service-telemetry(4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 00:20:21 crc kubenswrapper[4994]: E0310 00:20:21.379515 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7" Mar 10 00:20:21 crc 
kubenswrapper[4994]: I0310 00:20:21.586197 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx"] Mar 10 00:20:21 crc kubenswrapper[4994]: W0310 00:20:21.598527 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47cdbf6c_4d30_40c5_8af9_6685cd711b7a.slice/crio-930ca6ed0337859ec704828a7f5c8d2fc8b22ef59722422e4a262eeb62fb2134 WatchSource:0}: Error finding container 930ca6ed0337859ec704828a7f5c8d2fc8b22ef59722422e4a262eeb62fb2134: Status 404 returned error can't find the container with id 930ca6ed0337859ec704828a7f5c8d2fc8b22ef59722422e4a262eeb62fb2134 Mar 10 00:20:21 crc kubenswrapper[4994]: I0310 00:20:21.601314 4994 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 00:20:22 crc kubenswrapper[4994]: I0310 00:20:22.374980 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" event={"ID":"47cdbf6c-4d30-40c5-8af9-6685cd711b7a","Type":"ContainerStarted","Data":"930ca6ed0337859ec704828a7f5c8d2fc8b22ef59722422e4a262eeb62fb2134"} Mar 10 00:20:22 crc kubenswrapper[4994]: E0310 00:20:22.377291 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7" Mar 10 00:20:22 crc kubenswrapper[4994]: I0310 00:20:22.495643 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 10 00:20:22 crc kubenswrapper[4994]: I0310 00:20:22.528520 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 10 
00:20:23 crc kubenswrapper[4994]: E0310 00:20:23.382186 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7" Mar 10 00:20:24 crc kubenswrapper[4994]: E0310 00:20:24.423046 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7" Mar 10 00:20:25 crc kubenswrapper[4994]: I0310 00:20:25.398426 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" event={"ID":"47cdbf6c-4d30-40c5-8af9-6685cd711b7a","Type":"ContainerStarted","Data":"7331001565f4232abe5c69d7691e595fa5f9b278916200892cff4d382d0f652d"} Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.165139 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" podStartSLOduration=14.329187293 podStartE2EDuration="17.165124292s" podCreationTimestamp="2026-03-10 00:20:12 +0000 UTC" firstStartedPulling="2026-03-10 00:20:21.601108994 +0000 UTC m=+835.774815743" lastFinishedPulling="2026-03-10 00:20:24.437045993 +0000 UTC m=+838.610752742" observedRunningTime="2026-03-10 00:20:25.431302984 +0000 UTC m=+839.605009743" watchObservedRunningTime="2026-03-10 00:20:29.165124292 +0000 UTC m=+843.338831041" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.168952 4994 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-webhook-6888856db4-6qgfs"] Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.169583 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.171062 4994 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-vzkn9" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.173238 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.173365 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.180852 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-6qgfs"] Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.231722 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef38e78a-b3a6-4de7-ba46-598693edf905-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-6qgfs\" (UID: \"ef38e78a-b3a6-4de7-ba46-598693edf905\") " pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.231805 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfb4m\" (UniqueName: \"kubernetes.io/projected/ef38e78a-b3a6-4de7-ba46-598693edf905-kube-api-access-kfb4m\") pod \"cert-manager-webhook-6888856db4-6qgfs\" (UID: \"ef38e78a-b3a6-4de7-ba46-598693edf905\") " pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.342271 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/ef38e78a-b3a6-4de7-ba46-598693edf905-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-6qgfs\" (UID: \"ef38e78a-b3a6-4de7-ba46-598693edf905\") " pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.342348 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfb4m\" (UniqueName: \"kubernetes.io/projected/ef38e78a-b3a6-4de7-ba46-598693edf905-kube-api-access-kfb4m\") pod \"cert-manager-webhook-6888856db4-6qgfs\" (UID: \"ef38e78a-b3a6-4de7-ba46-598693edf905\") " pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.363974 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfb4m\" (UniqueName: \"kubernetes.io/projected/ef38e78a-b3a6-4de7-ba46-598693edf905-kube-api-access-kfb4m\") pod \"cert-manager-webhook-6888856db4-6qgfs\" (UID: \"ef38e78a-b3a6-4de7-ba46-598693edf905\") " pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.371384 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef38e78a-b3a6-4de7-ba46-598693edf905-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-6qgfs\" (UID: \"ef38e78a-b3a6-4de7-ba46-598693edf905\") " pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.486387 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.853768 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-6qgfs"] Mar 10 00:20:29 crc kubenswrapper[4994]: W0310 00:20:29.865118 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef38e78a_b3a6_4de7_ba46_598693edf905.slice/crio-19de56151c60af0a6346cfa30b5a994bcf4929f49c8203a3eb8905279f118ed4 WatchSource:0}: Error finding container 19de56151c60af0a6346cfa30b5a994bcf4929f49c8203a3eb8905279f118ed4: Status 404 returned error can't find the container with id 19de56151c60af0a6346cfa30b5a994bcf4929f49c8203a3eb8905279f118ed4 Mar 10 00:20:30 crc kubenswrapper[4994]: I0310 00:20:30.431232 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" event={"ID":"ef38e78a-b3a6-4de7-ba46-598693edf905","Type":"ContainerStarted","Data":"19de56151c60af0a6346cfa30b5a994bcf4929f49c8203a3eb8905279f118ed4"} Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.933232 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.934196 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.936815 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.936903 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca" Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.936997 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config" Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.944367 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca" Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.956552 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.964269 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-8qd55"] Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.964966 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.968406 4994 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-n56gh" Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.992076 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-8qd55"] Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.075837 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.075902 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.075934 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.075963 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.076024 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5kq7\" (UniqueName: \"kubernetes.io/projected/c5f1e9a7-bff0-4565-9cef-d8904908dbfe-kube-api-access-x5kq7\") pod \"cert-manager-cainjector-5545bd876-8qd55\" (UID: \"c5f1e9a7-bff0-4565-9cef-d8904908dbfe\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.076048 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.076063 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.076091 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t58hq\" (UniqueName: \"kubernetes.io/projected/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-kube-api-access-t58hq\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.076113 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f1e9a7-bff0-4565-9cef-d8904908dbfe-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-8qd55\" (UID: \"c5f1e9a7-bff0-4565-9cef-d8904908dbfe\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.076138 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.076156 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.076200 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.076219 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.076275 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177300 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177360 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177392 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 
00:20:32.177435 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5kq7\" (UniqueName: \"kubernetes.io/projected/c5f1e9a7-bff0-4565-9cef-d8904908dbfe-kube-api-access-x5kq7\") pod \"cert-manager-cainjector-5545bd876-8qd55\" (UID: \"c5f1e9a7-bff0-4565-9cef-d8904908dbfe\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177461 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177477 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177506 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t58hq\" (UniqueName: \"kubernetes.io/projected/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-kube-api-access-t58hq\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177528 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f1e9a7-bff0-4565-9cef-d8904908dbfe-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-8qd55\" (UID: 
\"c5f1e9a7-bff0-4565-9cef-d8904908dbfe\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177553 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177573 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177600 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177621 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177629 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177638 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177694 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177779 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.178303 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.178441 4994 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.178618 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.178716 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.178925 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.179133 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.179218 4994 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.183035 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.183749 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.200377 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t58hq\" (UniqueName: \"kubernetes.io/projected/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-kube-api-access-t58hq\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.206447 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5kq7\" (UniqueName: \"kubernetes.io/projected/c5f1e9a7-bff0-4565-9cef-d8904908dbfe-kube-api-access-x5kq7\") pod \"cert-manager-cainjector-5545bd876-8qd55\" (UID: \"c5f1e9a7-bff0-4565-9cef-d8904908dbfe\") " 
pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.207099 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f1e9a7-bff0-4565-9cef-d8904908dbfe-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-8qd55\" (UID: \"c5f1e9a7-bff0-4565-9cef-d8904908dbfe\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.258757 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.288030 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.677571 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 10 00:20:32 crc kubenswrapper[4994]: W0310 00:20:32.680586 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6fc7b86_bebf_4721_a8c1_88169e4ec64e.slice/crio-393421dea70664559a45ce4c485c1e4b38594dd2141443b113ceecaeb78599d7 WatchSource:0}: Error finding container 393421dea70664559a45ce4c485c1e4b38594dd2141443b113ceecaeb78599d7: Status 404 returned error can't find the container with id 393421dea70664559a45ce4c485c1e4b38594dd2141443b113ceecaeb78599d7 Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.726595 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-8qd55"] Mar 10 00:20:32 crc kubenswrapper[4994]: W0310 00:20:32.730150 4994 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5f1e9a7_bff0_4565_9cef_d8904908dbfe.slice/crio-0d070f4f7280c5a4bbb9dfba3e1b7cffa5d7980f21d312f77a2c73001fc4394a WatchSource:0}: Error finding container 0d070f4f7280c5a4bbb9dfba3e1b7cffa5d7980f21d312f77a2c73001fc4394a: Status 404 returned error can't find the container with id 0d070f4f7280c5a4bbb9dfba3e1b7cffa5d7980f21d312f77a2c73001fc4394a Mar 10 00:20:33 crc kubenswrapper[4994]: I0310 00:20:33.453370 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"a6fc7b86-bebf-4721-a8c1-88169e4ec64e","Type":"ContainerStarted","Data":"393421dea70664559a45ce4c485c1e4b38594dd2141443b113ceecaeb78599d7"} Mar 10 00:20:33 crc kubenswrapper[4994]: I0310 00:20:33.455023 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" event={"ID":"c5f1e9a7-bff0-4565-9cef-d8904908dbfe","Type":"ContainerStarted","Data":"0d070f4f7280c5a4bbb9dfba3e1b7cffa5d7980f21d312f77a2c73001fc4394a"} Mar 10 00:20:35 crc kubenswrapper[4994]: I0310 00:20:35.467899 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" event={"ID":"ef38e78a-b3a6-4de7-ba46-598693edf905","Type":"ContainerStarted","Data":"42aaa159671125a9e9bf07e6b24b5abfcafb509ce8769303c9d9b2f61858021a"} Mar 10 00:20:35 crc kubenswrapper[4994]: I0310 00:20:35.468337 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" Mar 10 00:20:35 crc kubenswrapper[4994]: I0310 00:20:35.468979 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" event={"ID":"c5f1e9a7-bff0-4565-9cef-d8904908dbfe","Type":"ContainerStarted","Data":"25f5126b3cbd60397c83cb729a3dc4b2450515dd72b0b24108d7d39bdc9ceb26"} Mar 10 00:20:35 crc kubenswrapper[4994]: I0310 00:20:35.485722 4994 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" podStartSLOduration=1.519395747 podStartE2EDuration="6.485702052s" podCreationTimestamp="2026-03-10 00:20:29 +0000 UTC" firstStartedPulling="2026-03-10 00:20:29.87408083 +0000 UTC m=+844.047787579" lastFinishedPulling="2026-03-10 00:20:34.840387145 +0000 UTC m=+849.014093884" observedRunningTime="2026-03-10 00:20:35.484947452 +0000 UTC m=+849.658654241" watchObservedRunningTime="2026-03-10 00:20:35.485702052 +0000 UTC m=+849.659408801" Mar 10 00:20:35 crc kubenswrapper[4994]: I0310 00:20:35.514612 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" podStartSLOduration=2.426832846 podStartE2EDuration="4.514588999s" podCreationTimestamp="2026-03-10 00:20:31 +0000 UTC" firstStartedPulling="2026-03-10 00:20:32.732317036 +0000 UTC m=+846.906023785" lastFinishedPulling="2026-03-10 00:20:34.820073159 +0000 UTC m=+848.993779938" observedRunningTime="2026-03-10 00:20:35.505243448 +0000 UTC m=+849.678950197" watchObservedRunningTime="2026-03-10 00:20:35.514588999 +0000 UTC m=+849.688295748" Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.489204 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.577470 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-jjkfq"] Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.579421 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-jjkfq" Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.582522 4994 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-r8kvs" Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.590529 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-jjkfq"] Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.685261 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/943085e6-2580-48ae-9c2d-d83989c6204c-bound-sa-token\") pod \"cert-manager-545d4d4674-jjkfq\" (UID: \"943085e6-2580-48ae-9c2d-d83989c6204c\") " pod="cert-manager/cert-manager-545d4d4674-jjkfq" Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.685393 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5z9b\" (UniqueName: \"kubernetes.io/projected/943085e6-2580-48ae-9c2d-d83989c6204c-kube-api-access-d5z9b\") pod \"cert-manager-545d4d4674-jjkfq\" (UID: \"943085e6-2580-48ae-9c2d-d83989c6204c\") " pod="cert-manager/cert-manager-545d4d4674-jjkfq" Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.787155 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5z9b\" (UniqueName: \"kubernetes.io/projected/943085e6-2580-48ae-9c2d-d83989c6204c-kube-api-access-d5z9b\") pod \"cert-manager-545d4d4674-jjkfq\" (UID: \"943085e6-2580-48ae-9c2d-d83989c6204c\") " pod="cert-manager/cert-manager-545d4d4674-jjkfq" Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.787366 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/943085e6-2580-48ae-9c2d-d83989c6204c-bound-sa-token\") pod \"cert-manager-545d4d4674-jjkfq\" (UID: 
\"943085e6-2580-48ae-9c2d-d83989c6204c\") " pod="cert-manager/cert-manager-545d4d4674-jjkfq" Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.807803 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/943085e6-2580-48ae-9c2d-d83989c6204c-bound-sa-token\") pod \"cert-manager-545d4d4674-jjkfq\" (UID: \"943085e6-2580-48ae-9c2d-d83989c6204c\") " pod="cert-manager/cert-manager-545d4d4674-jjkfq" Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.814540 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5z9b\" (UniqueName: \"kubernetes.io/projected/943085e6-2580-48ae-9c2d-d83989c6204c-kube-api-access-d5z9b\") pod \"cert-manager-545d4d4674-jjkfq\" (UID: \"943085e6-2580-48ae-9c2d-d83989c6204c\") " pod="cert-manager/cert-manager-545d4d4674-jjkfq" Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.903960 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-jjkfq" Mar 10 00:20:40 crc kubenswrapper[4994]: I0310 00:20:40.716461 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-jjkfq"] Mar 10 00:20:40 crc kubenswrapper[4994]: W0310 00:20:40.724111 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod943085e6_2580_48ae_9c2d_d83989c6204c.slice/crio-f37acacd45227243fd294bf3fa4e8c7791f9bdef5804af28f1c92bf0d5db623d WatchSource:0}: Error finding container f37acacd45227243fd294bf3fa4e8c7791f9bdef5804af28f1c92bf0d5db623d: Status 404 returned error can't find the container with id f37acacd45227243fd294bf3fa4e8c7791f9bdef5804af28f1c92bf0d5db623d Mar 10 00:20:41 crc kubenswrapper[4994]: I0310 00:20:41.510486 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" 
event={"ID":"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7","Type":"ContainerStarted","Data":"58298e7bf03a06c8764bfd97d0b2c25a7a168fdf40a89f470e9683bafd16507e"} Mar 10 00:20:41 crc kubenswrapper[4994]: I0310 00:20:41.512422 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-jjkfq" event={"ID":"943085e6-2580-48ae-9c2d-d83989c6204c","Type":"ContainerStarted","Data":"66d5b3e4394353a801c9f18585ccca284d9493d9da7ad89a57addd3971112512"} Mar 10 00:20:41 crc kubenswrapper[4994]: I0310 00:20:41.512462 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-jjkfq" event={"ID":"943085e6-2580-48ae-9c2d-d83989c6204c","Type":"ContainerStarted","Data":"f37acacd45227243fd294bf3fa4e8c7791f9bdef5804af28f1c92bf0d5db623d"} Mar 10 00:20:41 crc kubenswrapper[4994]: I0310 00:20:41.514129 4994 generic.go:334] "Generic (PLEG): container finished" podID="a6fc7b86-bebf-4721-a8c1-88169e4ec64e" containerID="9cca42badce94fad376287336f15b62eab0ecbaa17338ad7903e0ad928102f3f" exitCode=0 Mar 10 00:20:41 crc kubenswrapper[4994]: I0310 00:20:41.514160 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"a6fc7b86-bebf-4721-a8c1-88169e4ec64e","Type":"ContainerDied","Data":"9cca42badce94fad376287336f15b62eab0ecbaa17338ad7903e0ad928102f3f"} Mar 10 00:20:41 crc kubenswrapper[4994]: I0310 00:20:41.595999 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-jjkfq" podStartSLOduration=2.595975096 podStartE2EDuration="2.595975096s" podCreationTimestamp="2026-03-10 00:20:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:20:41.594015343 +0000 UTC m=+855.767722142" watchObservedRunningTime="2026-03-10 00:20:41.595975096 +0000 UTC m=+855.769681865" Mar 10 00:20:42 crc kubenswrapper[4994]: I0310 
00:20:42.000521 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 10 00:20:42 crc kubenswrapper[4994]: I0310 00:20:42.529231 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"a6fc7b86-bebf-4721-a8c1-88169e4ec64e","Type":"ContainerStarted","Data":"578327f2ef85774f35fe8c4237ddd2aa89ef148dfeb38a27116f7a82f23c231b"} Mar 10 00:20:42 crc kubenswrapper[4994]: I0310 00:20:42.531710 4994 generic.go:334] "Generic (PLEG): container finished" podID="4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7" containerID="58298e7bf03a06c8764bfd97d0b2c25a7a168fdf40a89f470e9683bafd16507e" exitCode=0 Mar 10 00:20:42 crc kubenswrapper[4994]: I0310 00:20:42.531820 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7","Type":"ContainerDied","Data":"58298e7bf03a06c8764bfd97d0b2c25a7a168fdf40a89f470e9683bafd16507e"} Mar 10 00:20:42 crc kubenswrapper[4994]: I0310 00:20:42.570028 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-1-build" podStartSLOduration=3.869946832 podStartE2EDuration="11.570000664s" podCreationTimestamp="2026-03-10 00:20:31 +0000 UTC" firstStartedPulling="2026-03-10 00:20:32.684270544 +0000 UTC m=+846.857977293" lastFinishedPulling="2026-03-10 00:20:40.384324376 +0000 UTC m=+854.558031125" observedRunningTime="2026-03-10 00:20:42.568538174 +0000 UTC m=+856.742244983" watchObservedRunningTime="2026-03-10 00:20:42.570000664 +0000 UTC m=+856.743707453" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.538497 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" podUID="a6fc7b86-bebf-4721-a8c1-88169e4ec64e" containerName="docker-build" 
containerID="cri-o://578327f2ef85774f35fe8c4237ddd2aa89ef148dfeb38a27116f7a82f23c231b" gracePeriod=30 Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.686750 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.687809 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.693350 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-sys-config" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.693480 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.693530 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.722438 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.751663 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.751720 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: 
\"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.751758 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.751781 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.751814 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.751911 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.751940 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.751963 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdv98\" (UniqueName: \"kubernetes.io/projected/46c9545f-e40a-413b-834d-b428cedc906b-kube-api-access-pdv98\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.752030 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.752055 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.752592 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-pull\") pod 
\"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.752720 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853567 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853612 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853632 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853655 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" 
(UniqueName: \"kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853676 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853692 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853711 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853727 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853747 4994 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853776 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853791 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853807 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdv98\" (UniqueName: \"kubernetes.io/projected/46c9545f-e40a-413b-834d-b428cedc906b-kube-api-access-pdv98\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.854009 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.854026 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.854466 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.854583 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.854700 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.854801 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: 
\"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.855108 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.855259 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.855360 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.867035 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.867274 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: 
\"kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.868635 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdv98\" (UniqueName: \"kubernetes.io/projected/46c9545f-e40a-413b-834d-b428cedc906b-kube-api-access-pdv98\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:44 crc kubenswrapper[4994]: I0310 00:20:44.001992 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:46 crc kubenswrapper[4994]: I0310 00:20:46.386800 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 10 00:20:46 crc kubenswrapper[4994]: I0310 00:20:46.559982 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"46c9545f-e40a-413b-834d-b428cedc906b","Type":"ContainerStarted","Data":"d087a7e67b101576e99855a78361be7581843febff904371846d8544b91e846f"} Mar 10 00:20:48 crc kubenswrapper[4994]: I0310 00:20:48.592051 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"46c9545f-e40a-413b-834d-b428cedc906b","Type":"ContainerStarted","Data":"5f529cfcb855ffbe234aadb07f23381b9405d00ba132dfa647c6f38b1781ad1a"} Mar 10 00:20:48 crc kubenswrapper[4994]: I0310 00:20:48.594689 4994 generic.go:334] "Generic (PLEG): container finished" podID="4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7" containerID="54db6cc7ea3d8404e3f2f0fb896192a9783811677c1a276c79cd3f8eb405c3de" exitCode=0 Mar 10 00:20:48 crc kubenswrapper[4994]: I0310 
00:20:48.594737 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7","Type":"ContainerDied","Data":"54db6cc7ea3d8404e3f2f0fb896192a9783811677c1a276c79cd3f8eb405c3de"} Mar 10 00:20:48 crc kubenswrapper[4994]: I0310 00:20:48.893254 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:20:48 crc kubenswrapper[4994]: I0310 00:20:48.894030 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:20:49 crc kubenswrapper[4994]: I0310 00:20:49.610065 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_a6fc7b86-bebf-4721-a8c1-88169e4ec64e/docker-build/0.log" Mar 10 00:20:49 crc kubenswrapper[4994]: I0310 00:20:49.611356 4994 generic.go:334] "Generic (PLEG): container finished" podID="a6fc7b86-bebf-4721-a8c1-88169e4ec64e" containerID="578327f2ef85774f35fe8c4237ddd2aa89ef148dfeb38a27116f7a82f23c231b" exitCode=1 Mar 10 00:20:49 crc kubenswrapper[4994]: I0310 00:20:49.611469 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"a6fc7b86-bebf-4721-a8c1-88169e4ec64e","Type":"ContainerDied","Data":"578327f2ef85774f35fe8c4237ddd2aa89ef148dfeb38a27116f7a82f23c231b"} Mar 10 00:20:49 crc kubenswrapper[4994]: I0310 00:20:49.616212 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7","Type":"ContainerStarted","Data":"d5ebb9902f4cc566f26df996f576738a2abe32f828dc16e6f469eda8c02270a9"} Mar 10 00:20:49 crc kubenswrapper[4994]: I0310 00:20:49.616499 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:20:49 crc kubenswrapper[4994]: I0310 00:20:49.707124 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=14.025372242 podStartE2EDuration="54.707102824s" podCreationTimestamp="2026-03-10 00:19:55 +0000 UTC" firstStartedPulling="2026-03-10 00:19:59.671642702 +0000 UTC m=+813.845349461" lastFinishedPulling="2026-03-10 00:20:40.353373284 +0000 UTC m=+854.527080043" observedRunningTime="2026-03-10 00:20:49.694640869 +0000 UTC m=+863.868347628" watchObservedRunningTime="2026-03-10 00:20:49.707102824 +0000 UTC m=+863.880809583" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.003327 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_a6fc7b86-bebf-4721-a8c1-88169e4ec64e/docker-build/0.log" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.004193 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161080 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-blob-cache\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161120 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-run\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161150 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-node-pullsecrets\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161181 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildworkdir\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161212 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-push\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161260 4994 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161276 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t58hq\" (UniqueName: \"kubernetes.io/projected/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-kube-api-access-t58hq\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161310 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildcachedir\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161345 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-root\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161385 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-pull\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161413 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-ca-bundles\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161436 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-proxy-ca-bundles\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161458 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-system-configs\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161615 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161785 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161849 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161847 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161883 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.162204 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.162338 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.162365 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.162615 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.162815 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.166756 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.167037 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-kube-api-access-t58hq" (OuterVolumeSpecName: "kube-api-access-t58hq") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "kube-api-access-t58hq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.167348 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.263145 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.263187 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.263204 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.263214 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.263229 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.263238 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.263250 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.263262 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.263270 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.263278 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t58hq\" (UniqueName: \"kubernetes.io/projected/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-kube-api-access-t58hq\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.622396 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_a6fc7b86-bebf-4721-a8c1-88169e4ec64e/docker-build/0.log" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.623127 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.623127 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"a6fc7b86-bebf-4721-a8c1-88169e4ec64e","Type":"ContainerDied","Data":"393421dea70664559a45ce4c485c1e4b38594dd2141443b113ceecaeb78599d7"} Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.623190 4994 scope.go:117] "RemoveContainer" containerID="578327f2ef85774f35fe8c4237ddd2aa89ef148dfeb38a27116f7a82f23c231b" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.653132 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.656577 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.662007 4994 scope.go:117] "RemoveContainer" containerID="9cca42badce94fad376287336f15b62eab0ecbaa17338ad7903e0ad928102f3f" Mar 10 00:20:52 crc kubenswrapper[4994]: I0310 00:20:52.568614 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6fc7b86-bebf-4721-a8c1-88169e4ec64e" path="/var/lib/kubelet/pods/a6fc7b86-bebf-4721-a8c1-88169e4ec64e/volumes" Mar 10 00:20:59 crc kubenswrapper[4994]: I0310 00:20:59.710272 4994 generic.go:334] "Generic (PLEG): container finished" podID="46c9545f-e40a-413b-834d-b428cedc906b" containerID="5f529cfcb855ffbe234aadb07f23381b9405d00ba132dfa647c6f38b1781ad1a" exitCode=0 Mar 10 00:20:59 crc kubenswrapper[4994]: I0310 00:20:59.710909 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"46c9545f-e40a-413b-834d-b428cedc906b","Type":"ContainerDied","Data":"5f529cfcb855ffbe234aadb07f23381b9405d00ba132dfa647c6f38b1781ad1a"} Mar 10 00:21:00 crc kubenswrapper[4994]: I0310 00:21:00.722263 4994 generic.go:334] "Generic (PLEG): container finished" podID="46c9545f-e40a-413b-834d-b428cedc906b" containerID="6c574b09f77fce2d6065cbbc628a9fe5b857fe1528785e7e247d3386857f7208" exitCode=0 Mar 10 00:21:00 crc kubenswrapper[4994]: I0310 00:21:00.722333 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"46c9545f-e40a-413b-834d-b428cedc906b","Type":"ContainerDied","Data":"6c574b09f77fce2d6065cbbc628a9fe5b857fe1528785e7e247d3386857f7208"} Mar 10 00:21:00 crc kubenswrapper[4994]: I0310 00:21:00.793189 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_46c9545f-e40a-413b-834d-b428cedc906b/manage-dockerfile/0.log" Mar 10 00:21:01 crc kubenswrapper[4994]: I0310 00:21:01.252550 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7" containerName="elasticsearch" probeResult="failure" output=< Mar 10 00:21:01 crc kubenswrapper[4994]: {"timestamp": "2026-03-10T00:21:01+00:00", "message": "readiness probe failed", "curl_rc": "7"} Mar 10 00:21:01 crc kubenswrapper[4994]: > Mar 10 00:21:01 crc kubenswrapper[4994]: I0310 00:21:01.732926 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"46c9545f-e40a-413b-834d-b428cedc906b","Type":"ContainerStarted","Data":"7902da0dd9911d9b653dd31e0ee68587f1799979b11fd8b932f59d984685714a"} Mar 10 00:21:06 crc kubenswrapper[4994]: I0310 00:21:06.576585 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:21:06 crc kubenswrapper[4994]: I0310 00:21:06.615764 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-2-build" podStartSLOduration=23.615741845 podStartE2EDuration="23.615741845s" podCreationTimestamp="2026-03-10 00:20:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:21:01.781036349 +0000 UTC m=+875.954743098" watchObservedRunningTime="2026-03-10 00:21:06.615741845 +0000 UTC m=+880.789448594" Mar 10 00:21:18 crc kubenswrapper[4994]: I0310 00:21:18.892751 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:21:18 crc kubenswrapper[4994]: I0310 00:21:18.893258 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:21:48 crc kubenswrapper[4994]: I0310 00:21:48.892132 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:21:48 crc kubenswrapper[4994]: I0310 00:21:48.892735 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:21:48 crc kubenswrapper[4994]: I0310 00:21:48.892795 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:21:48 crc kubenswrapper[4994]: I0310 00:21:48.893645 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"226ba5dd02665930d82054325a4e53e30bff51d1812a48cc472d4cc84e6237db"} pod="openshift-machine-config-operator/machine-config-daemon-kfljj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 00:21:48 crc kubenswrapper[4994]: I0310 00:21:48.893730 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" containerID="cri-o://226ba5dd02665930d82054325a4e53e30bff51d1812a48cc472d4cc84e6237db" gracePeriod=600 Mar 10 00:21:49 crc kubenswrapper[4994]: E0310 00:21:49.001182 4994 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podced5d66d_39df_4267_b801_e1e60d517ace.slice/crio-226ba5dd02665930d82054325a4e53e30bff51d1812a48cc472d4cc84e6237db.scope\": RecentStats: unable to find data in memory cache]" Mar 10 00:21:49 crc kubenswrapper[4994]: I0310 00:21:49.085332 4994 generic.go:334] "Generic (PLEG): container finished" podID="ced5d66d-39df-4267-b801-e1e60d517ace" containerID="226ba5dd02665930d82054325a4e53e30bff51d1812a48cc472d4cc84e6237db" exitCode=0 Mar 10 00:21:49 crc kubenswrapper[4994]: I0310 00:21:49.085394 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerDied","Data":"226ba5dd02665930d82054325a4e53e30bff51d1812a48cc472d4cc84e6237db"} Mar 10 00:21:49 crc kubenswrapper[4994]: I0310 00:21:49.085648 4994 scope.go:117] "RemoveContainer" containerID="3a58b30808ba3fd3b4a9259ace5f595c7d1bd5910098d24eb4a7ede149499cfa" Mar 10 00:21:50 crc kubenswrapper[4994]: I0310 00:21:50.097697 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"4df6fccd58598d7ef69c17858b0dc84c63fc5b8f887c3889c08c1cb8b68f120e"} Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.151275 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551702-qz8hb"] Mar 10 00:22:00 crc kubenswrapper[4994]: E0310 00:22:00.152259 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6fc7b86-bebf-4721-a8c1-88169e4ec64e" containerName="manage-dockerfile" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.152280 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6fc7b86-bebf-4721-a8c1-88169e4ec64e" containerName="manage-dockerfile" Mar 10 00:22:00 crc kubenswrapper[4994]: E0310 00:22:00.152311 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6fc7b86-bebf-4721-a8c1-88169e4ec64e" containerName="docker-build" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.152323 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6fc7b86-bebf-4721-a8c1-88169e4ec64e" containerName="docker-build" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.152507 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6fc7b86-bebf-4721-a8c1-88169e4ec64e" containerName="docker-build" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.153144 4994 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551702-qz8hb" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.157058 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.159935 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.160023 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.162223 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551702-qz8hb"] Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.179711 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd2zd\" (UniqueName: \"kubernetes.io/projected/0dbd271f-5f29-4221-bfbe-2274ce440c29-kube-api-access-kd2zd\") pod \"auto-csr-approver-29551702-qz8hb\" (UID: \"0dbd271f-5f29-4221-bfbe-2274ce440c29\") " pod="openshift-infra/auto-csr-approver-29551702-qz8hb" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.280446 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd2zd\" (UniqueName: \"kubernetes.io/projected/0dbd271f-5f29-4221-bfbe-2274ce440c29-kube-api-access-kd2zd\") pod \"auto-csr-approver-29551702-qz8hb\" (UID: \"0dbd271f-5f29-4221-bfbe-2274ce440c29\") " pod="openshift-infra/auto-csr-approver-29551702-qz8hb" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.302077 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd2zd\" (UniqueName: \"kubernetes.io/projected/0dbd271f-5f29-4221-bfbe-2274ce440c29-kube-api-access-kd2zd\") pod \"auto-csr-approver-29551702-qz8hb\" (UID: 
\"0dbd271f-5f29-4221-bfbe-2274ce440c29\") " pod="openshift-infra/auto-csr-approver-29551702-qz8hb" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.470328 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551702-qz8hb" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.685389 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551702-qz8hb"] Mar 10 00:22:01 crc kubenswrapper[4994]: I0310 00:22:01.187038 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551702-qz8hb" event={"ID":"0dbd271f-5f29-4221-bfbe-2274ce440c29","Type":"ContainerStarted","Data":"53856e54815dff5bb08b5f00e560176a1c7b6df0ea506a96af56d1a1e3dd8b10"} Mar 10 00:22:03 crc kubenswrapper[4994]: I0310 00:22:03.200173 4994 generic.go:334] "Generic (PLEG): container finished" podID="0dbd271f-5f29-4221-bfbe-2274ce440c29" containerID="579778a9ca15bde63cc28eb094f98e7025e92dc67fe3a2f215a9a028c2966910" exitCode=0 Mar 10 00:22:03 crc kubenswrapper[4994]: I0310 00:22:03.200317 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551702-qz8hb" event={"ID":"0dbd271f-5f29-4221-bfbe-2274ce440c29","Type":"ContainerDied","Data":"579778a9ca15bde63cc28eb094f98e7025e92dc67fe3a2f215a9a028c2966910"} Mar 10 00:22:04 crc kubenswrapper[4994]: I0310 00:22:04.521859 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551702-qz8hb" Mar 10 00:22:04 crc kubenswrapper[4994]: I0310 00:22:04.539806 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd2zd\" (UniqueName: \"kubernetes.io/projected/0dbd271f-5f29-4221-bfbe-2274ce440c29-kube-api-access-kd2zd\") pod \"0dbd271f-5f29-4221-bfbe-2274ce440c29\" (UID: \"0dbd271f-5f29-4221-bfbe-2274ce440c29\") " Mar 10 00:22:04 crc kubenswrapper[4994]: I0310 00:22:04.546736 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dbd271f-5f29-4221-bfbe-2274ce440c29-kube-api-access-kd2zd" (OuterVolumeSpecName: "kube-api-access-kd2zd") pod "0dbd271f-5f29-4221-bfbe-2274ce440c29" (UID: "0dbd271f-5f29-4221-bfbe-2274ce440c29"). InnerVolumeSpecName "kube-api-access-kd2zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:22:04 crc kubenswrapper[4994]: I0310 00:22:04.641710 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd2zd\" (UniqueName: \"kubernetes.io/projected/0dbd271f-5f29-4221-bfbe-2274ce440c29-kube-api-access-kd2zd\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:05 crc kubenswrapper[4994]: I0310 00:22:05.218633 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551702-qz8hb" event={"ID":"0dbd271f-5f29-4221-bfbe-2274ce440c29","Type":"ContainerDied","Data":"53856e54815dff5bb08b5f00e560176a1c7b6df0ea506a96af56d1a1e3dd8b10"} Mar 10 00:22:05 crc kubenswrapper[4994]: I0310 00:22:05.218690 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53856e54815dff5bb08b5f00e560176a1c7b6df0ea506a96af56d1a1e3dd8b10" Mar 10 00:22:05 crc kubenswrapper[4994]: I0310 00:22:05.218696 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551702-qz8hb" Mar 10 00:22:05 crc kubenswrapper[4994]: I0310 00:22:05.604474 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551696-w9ztg"] Mar 10 00:22:05 crc kubenswrapper[4994]: I0310 00:22:05.612164 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551696-w9ztg"] Mar 10 00:22:06 crc kubenswrapper[4994]: I0310 00:22:06.564105 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f25bd204-3572-4880-b74f-764a5a3e0123" path="/var/lib/kubelet/pods/f25bd204-3572-4880-b74f-764a5a3e0123/volumes" Mar 10 00:22:21 crc kubenswrapper[4994]: I0310 00:22:21.374467 4994 scope.go:117] "RemoveContainer" containerID="60b85c7f8cd24fb6dc7f7bf060fdb2ddff2c7fdefc6188b8ccedeba460b2b511" Mar 10 00:22:29 crc kubenswrapper[4994]: I0310 00:22:29.404469 4994 generic.go:334] "Generic (PLEG): container finished" podID="46c9545f-e40a-413b-834d-b428cedc906b" containerID="7902da0dd9911d9b653dd31e0ee68587f1799979b11fd8b932f59d984685714a" exitCode=0 Mar 10 00:22:29 crc kubenswrapper[4994]: I0310 00:22:29.405225 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"46c9545f-e40a-413b-834d-b428cedc906b","Type":"ContainerDied","Data":"7902da0dd9911d9b653dd31e0ee68587f1799979b11fd8b932f59d984685714a"} Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.763187 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.930667 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-build-blob-cache\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.930806 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-ca-bundles\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.930914 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-root\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931000 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-pull\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931071 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-system-configs\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931116 4994 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdv98\" (UniqueName: \"kubernetes.io/projected/46c9545f-e40a-413b-834d-b428cedc906b-kube-api-access-pdv98\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931173 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-node-pullsecrets\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931237 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-buildcachedir\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931364 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931400 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931486 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-push\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") "
Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931568 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-proxy-ca-bundles\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") "
Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931686 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-buildworkdir\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") "
Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931994 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.932039 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.932428 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.933038 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-run\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") "
Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.933407 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.933434 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.933453 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.933471 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.933488 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.934650 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.944576 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.950051 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46c9545f-e40a-413b-834d-b428cedc906b-kube-api-access-pdv98" (OuterVolumeSpecName: "kube-api-access-pdv98") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "kube-api-access-pdv98". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.951061 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.993366 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:22:31 crc kubenswrapper[4994]: I0310 00:22:31.034324 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\""
Mar 10 00:22:31 crc kubenswrapper[4994]: I0310 00:22:31.034608 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 10 00:22:31 crc kubenswrapper[4994]: I0310 00:22:31.034690 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 10 00:22:31 crc kubenswrapper[4994]: I0310 00:22:31.034773 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\""
Mar 10 00:22:31 crc kubenswrapper[4994]: I0310 00:22:31.034846 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdv98\" (UniqueName: \"kubernetes.io/projected/46c9545f-e40a-413b-834d-b428cedc906b-kube-api-access-pdv98\") on node \"crc\" DevicePath \"\""
Mar 10 00:22:31 crc kubenswrapper[4994]: I0310 00:22:31.129628 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:22:31 crc kubenswrapper[4994]: I0310 00:22:31.137004 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 10 00:22:31 crc kubenswrapper[4994]: I0310 00:22:31.424678 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"46c9545f-e40a-413b-834d-b428cedc906b","Type":"ContainerDied","Data":"d087a7e67b101576e99855a78361be7581843febff904371846d8544b91e846f"}
Mar 10 00:22:31 crc kubenswrapper[4994]: I0310 00:22:31.424743 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d087a7e67b101576e99855a78361be7581843febff904371846d8544b91e846f"
Mar 10 00:22:31 crc kubenswrapper[4994]: I0310 00:22:31.424861 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build"
Mar 10 00:22:33 crc kubenswrapper[4994]: I0310 00:22:33.669335 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:22:33 crc kubenswrapper[4994]: I0310 00:22:33.677511 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.142217 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Mar 10 00:22:35 crc kubenswrapper[4994]: E0310 00:22:35.142554 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dbd271f-5f29-4221-bfbe-2274ce440c29" containerName="oc"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.142575 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dbd271f-5f29-4221-bfbe-2274ce440c29" containerName="oc"
Mar 10 00:22:35 crc kubenswrapper[4994]: E0310 00:22:35.142597 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c9545f-e40a-413b-834d-b428cedc906b" containerName="manage-dockerfile"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.142609 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c9545f-e40a-413b-834d-b428cedc906b" containerName="manage-dockerfile"
Mar 10 00:22:35 crc kubenswrapper[4994]: E0310 00:22:35.142635 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c9545f-e40a-413b-834d-b428cedc906b" containerName="git-clone"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.142649 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c9545f-e40a-413b-834d-b428cedc906b" containerName="git-clone"
Mar 10 00:22:35 crc kubenswrapper[4994]: E0310 00:22:35.142679 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c9545f-e40a-413b-834d-b428cedc906b" containerName="docker-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.142691 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c9545f-e40a-413b-834d-b428cedc906b" containerName="docker-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.142898 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dbd271f-5f29-4221-bfbe-2274ce440c29" containerName="oc"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.142919 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="46c9545f-e40a-413b-834d-b428cedc906b" containerName="docker-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.144150 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.146800 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-sys-config"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.146803 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.147476 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-ca"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.148641 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-global-ca"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.166605 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.199991 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200109 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4ttb\" (UniqueName: \"kubernetes.io/projected/10f5518f-f5ed-47fb-b443-fae9128aec81-kube-api-access-m4ttb\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200156 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200191 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200220 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200270 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200328 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200367 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200399 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200432 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200477 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200523 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.301747 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.301849 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4ttb\" (UniqueName: \"kubernetes.io/projected/10f5518f-f5ed-47fb-b443-fae9128aec81-kube-api-access-m4ttb\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.301916 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.301953 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.301985 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.302037 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.302093 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.302131 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.302164 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.302196 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.302242 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.302462 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.302852 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.302961 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.303047 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.303304 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.303461 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.303751 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.304073 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.304255 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.304721 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.310701 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.313211 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.333718 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4ttb\" (UniqueName: \"kubernetes.io/projected/10f5518f-f5ed-47fb-b443-fae9128aec81-kube-api-access-m4ttb\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.469459 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.681524 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Mar 10 00:22:36 crc kubenswrapper[4994]: I0310 00:22:36.470650 4994 generic.go:334] "Generic (PLEG): container finished" podID="10f5518f-f5ed-47fb-b443-fae9128aec81" containerID="eb388fa374116b603771455669ec292f8665b546692a48ed5cd70354b73fe681" exitCode=0
Mar 10 00:22:36 crc kubenswrapper[4994]: I0310 00:22:36.470749 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"10f5518f-f5ed-47fb-b443-fae9128aec81","Type":"ContainerDied","Data":"eb388fa374116b603771455669ec292f8665b546692a48ed5cd70354b73fe681"}
Mar 10 00:22:36 crc kubenswrapper[4994]: I0310 00:22:36.471071 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"10f5518f-f5ed-47fb-b443-fae9128aec81","Type":"ContainerStarted","Data":"2a6c5f55e005a4b106f0f86645e6873333919851ee6a45e984f93fbe7bf04a2e"}
Mar 10 00:22:37 crc kubenswrapper[4994]: I0310 00:22:37.482268 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"10f5518f-f5ed-47fb-b443-fae9128aec81","Type":"ContainerStarted","Data":"5b7a38adc0c6a78554c415c32e9eb6f227d97fceb20d2f07317cb0b1675be632"}
Mar 10 00:22:37 crc kubenswrapper[4994]: I0310 00:22:37.520320 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-1-build" podStartSLOduration=2.520295173 podStartE2EDuration="2.520295173s" podCreationTimestamp="2026-03-10 00:22:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:22:37.511956998 +0000 UTC m=+971.685663767" watchObservedRunningTime="2026-03-10 00:22:37.520295173 +0000 UTC m=+971.694001962"
Mar 10 00:22:46 crc kubenswrapper[4994]: I0310 00:22:46.061542 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Mar 10 00:22:46 crc kubenswrapper[4994]: I0310 00:22:46.062483 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/smart-gateway-operator-1-build" podUID="10f5518f-f5ed-47fb-b443-fae9128aec81" containerName="docker-build" containerID="cri-o://5b7a38adc0c6a78554c415c32e9eb6f227d97fceb20d2f07317cb0b1675be632" gracePeriod=30
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.560709 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_10f5518f-f5ed-47fb-b443-fae9128aec81/docker-build/0.log"
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.561505 4994 generic.go:334] "Generic (PLEG): container finished" podID="10f5518f-f5ed-47fb-b443-fae9128aec81" containerID="5b7a38adc0c6a78554c415c32e9eb6f227d97fceb20d2f07317cb0b1675be632" exitCode=1
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.561555 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"10f5518f-f5ed-47fb-b443-fae9128aec81","Type":"ContainerDied","Data":"5b7a38adc0c6a78554c415c32e9eb6f227d97fceb20d2f07317cb0b1675be632"}
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.561637 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"10f5518f-f5ed-47fb-b443-fae9128aec81","Type":"ContainerDied","Data":"2a6c5f55e005a4b106f0f86645e6873333919851ee6a45e984f93fbe7bf04a2e"}
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.561660 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a6c5f55e005a4b106f0f86645e6873333919851ee6a45e984f93fbe7bf04a2e"
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.608719 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_10f5518f-f5ed-47fb-b443-fae9128aec81/docker-build/0.log"
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.609363 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build"
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.718340 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-ca-bundles\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") "
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.718416 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-buildcachedir\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") "
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.718449 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-pull\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") "
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.718481 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-proxy-ca-bundles\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") "
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.718508 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-buildworkdir\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") "
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.718579 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-push\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") "
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.718606 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-system-configs\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") "
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.718645 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-node-pullsecrets\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") "
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.718822 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.719286 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.719328 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.719365 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.719928 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.721050 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.718680 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4ttb\" (UniqueName: \"kubernetes.io/projected/10f5518f-f5ed-47fb-b443-fae9128aec81-kube-api-access-m4ttb\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") "
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.721176 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-root\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") "
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.721209 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-run\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") "
Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.721241 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-build-blob-cache\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") "
Mar 10 00:22:47 crc
kubenswrapper[4994]: I0310 00:22:47.722370 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.722810 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.722829 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.722842 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.722853 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.722864 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.722898 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.722913 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.723342 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 10 00:22:47 crc kubenswrapper[4994]: E0310 00:22:47.723738 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10f5518f-f5ed-47fb-b443-fae9128aec81" containerName="docker-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.723773 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f5518f-f5ed-47fb-b443-fae9128aec81" containerName="docker-build" Mar 10 00:22:47 crc kubenswrapper[4994]: E0310 00:22:47.723796 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10f5518f-f5ed-47fb-b443-fae9128aec81" containerName="manage-dockerfile" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.723805 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f5518f-f5ed-47fb-b443-fae9128aec81" containerName="manage-dockerfile" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.723950 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="10f5518f-f5ed-47fb-b443-fae9128aec81" containerName="docker-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.724773 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.726468 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.727007 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.736507 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-sys-config" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.736767 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-global-ca" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.736949 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-ca" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.742163 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.746097 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10f5518f-f5ed-47fb-b443-fae9128aec81-kube-api-access-m4ttb" (OuterVolumeSpecName: "kube-api-access-m4ttb") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "kube-api-access-m4ttb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.824617 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.824694 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.824745 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.824793 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.824838 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.824899 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.824962 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.825001 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.825072 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: 
I0310 00:22:47.825122 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.825170 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.825337 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thrgh\" (UniqueName: \"kubernetes.io/projected/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-kube-api-access-thrgh\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.825502 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.825528 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4ttb\" (UniqueName: \"kubernetes.io/projected/10f5518f-f5ed-47fb-b443-fae9128aec81-kube-api-access-m4ttb\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.825543 4994 reconciler_common.go:293] "Volume detached for volume 
\"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.916905 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.926253 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.926345 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.926403 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.926482 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.926535 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.926573 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.927773 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.927839 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.927163 4994 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.927839 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.928105 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thrgh\" (UniqueName: \"kubernetes.io/projected/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-kube-api-access-thrgh\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.926662 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.928299 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc 
kubenswrapper[4994]: I0310 00:22:47.927068 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.928719 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.928750 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.928854 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.929033 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.929103 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.929226 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.929319 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.929945 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.948176 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.948186 4994 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.952965 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thrgh\" (UniqueName: \"kubernetes.io/projected/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-kube-api-access-thrgh\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:48 crc kubenswrapper[4994]: I0310 00:22:48.087433 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:48 crc kubenswrapper[4994]: I0310 00:22:48.158114 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:22:48 crc kubenswrapper[4994]: I0310 00:22:48.234463 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:48 crc kubenswrapper[4994]: I0310 00:22:48.323998 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 10 00:22:48 crc kubenswrapper[4994]: I0310 00:22:48.572087 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"998fc9e9-e892-4f8f-9c76-6a6129ab98e7","Type":"ContainerStarted","Data":"c8f48d2ddd1148a083fa4a3e0e28f96eeb589b9ec52ced72fd24a2fc874d5874"} Mar 10 00:22:48 crc kubenswrapper[4994]: I0310 00:22:48.572120 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:48 crc kubenswrapper[4994]: I0310 00:22:48.632955 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 10 00:22:48 crc kubenswrapper[4994]: I0310 00:22:48.642949 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 10 00:22:49 crc kubenswrapper[4994]: I0310 00:22:49.583173 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"998fc9e9-e892-4f8f-9c76-6a6129ab98e7","Type":"ContainerStarted","Data":"bc29d20723fae562ff49599b98ec8b08c65f2481204182fd5331b443e783e6ed"} Mar 10 00:22:50 crc kubenswrapper[4994]: I0310 00:22:50.567171 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10f5518f-f5ed-47fb-b443-fae9128aec81" path="/var/lib/kubelet/pods/10f5518f-f5ed-47fb-b443-fae9128aec81/volumes" Mar 10 00:22:50 crc 
kubenswrapper[4994]: I0310 00:22:50.594185 4994 generic.go:334] "Generic (PLEG): container finished" podID="998fc9e9-e892-4f8f-9c76-6a6129ab98e7" containerID="bc29d20723fae562ff49599b98ec8b08c65f2481204182fd5331b443e783e6ed" exitCode=0 Mar 10 00:22:50 crc kubenswrapper[4994]: I0310 00:22:50.594246 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"998fc9e9-e892-4f8f-9c76-6a6129ab98e7","Type":"ContainerDied","Data":"bc29d20723fae562ff49599b98ec8b08c65f2481204182fd5331b443e783e6ed"} Mar 10 00:22:51 crc kubenswrapper[4994]: I0310 00:22:51.607237 4994 generic.go:334] "Generic (PLEG): container finished" podID="998fc9e9-e892-4f8f-9c76-6a6129ab98e7" containerID="29ab98ad95069b94cd163c490a7f86c25c7ed6375e4010a92852a665528555c8" exitCode=0 Mar 10 00:22:51 crc kubenswrapper[4994]: I0310 00:22:51.607310 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"998fc9e9-e892-4f8f-9c76-6a6129ab98e7","Type":"ContainerDied","Data":"29ab98ad95069b94cd163c490a7f86c25c7ed6375e4010a92852a665528555c8"} Mar 10 00:22:51 crc kubenswrapper[4994]: I0310 00:22:51.649571 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-2-build_998fc9e9-e892-4f8f-9c76-6a6129ab98e7/manage-dockerfile/0.log" Mar 10 00:22:52 crc kubenswrapper[4994]: I0310 00:22:52.618001 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"998fc9e9-e892-4f8f-9c76-6a6129ab98e7","Type":"ContainerStarted","Data":"54bdbba3bbbc238dc5b0ebf2dbf751d59ee9f9abc3b40893b0cd282f9a3228f6"} Mar 10 00:22:52 crc kubenswrapper[4994]: I0310 00:22:52.659553 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-2-build" podStartSLOduration=5.659529983 podStartE2EDuration="5.659529983s" podCreationTimestamp="2026-03-10 00:22:47 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:22:52.6505043 +0000 UTC m=+986.824211079" watchObservedRunningTime="2026-03-10 00:22:52.659529983 +0000 UTC m=+986.833236762" Mar 10 00:23:56 crc kubenswrapper[4994]: I0310 00:23:56.062847 4994 generic.go:334] "Generic (PLEG): container finished" podID="998fc9e9-e892-4f8f-9c76-6a6129ab98e7" containerID="54bdbba3bbbc238dc5b0ebf2dbf751d59ee9f9abc3b40893b0cd282f9a3228f6" exitCode=0 Mar 10 00:23:56 crc kubenswrapper[4994]: I0310 00:23:56.062986 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"998fc9e9-e892-4f8f-9c76-6a6129ab98e7","Type":"ContainerDied","Data":"54bdbba3bbbc238dc5b0ebf2dbf751d59ee9f9abc3b40893b0cd282f9a3228f6"} Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.381849 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.494159 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-node-pullsecrets\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.494323 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.494548 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildworkdir\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.494714 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-run\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.494780 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-ca-bundles\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.494839 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-blob-cache\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.494948 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-system-configs\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.494986 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildcachedir\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495048 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thrgh\" (UniqueName: \"kubernetes.io/projected/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-kube-api-access-thrgh\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495091 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-proxy-ca-bundles\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495157 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-push\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495183 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495245 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-root\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495331 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-pull\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495730 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495798 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495968 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495984 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495994 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.496002 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.496066 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.496336 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). 
InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.500921 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.509505 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.509540 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-kube-api-access-thrgh" (OuterVolumeSpecName: "kube-api-access-thrgh") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "kube-api-access-thrgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.509540 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.598944 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thrgh\" (UniqueName: \"kubernetes.io/projected/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-kube-api-access-thrgh\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.599024 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.599038 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.599051 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.599062 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.599076 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.746433 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-blob-cache" (OuterVolumeSpecName: 
"build-blob-cache") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.803704 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:58 crc kubenswrapper[4994]: I0310 00:23:58.080536 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"998fc9e9-e892-4f8f-9c76-6a6129ab98e7","Type":"ContainerDied","Data":"c8f48d2ddd1148a083fa4a3e0e28f96eeb589b9ec52ced72fd24a2fc874d5874"} Mar 10 00:23:58 crc kubenswrapper[4994]: I0310 00:23:58.080584 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8f48d2ddd1148a083fa4a3e0e28f96eeb589b9ec52ced72fd24a2fc874d5874" Mar 10 00:23:58 crc kubenswrapper[4994]: I0310 00:23:58.080673 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:23:59 crc kubenswrapper[4994]: I0310 00:23:59.537898 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:23:59 crc kubenswrapper[4994]: I0310 00:23:59.630755 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.156351 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551704-8n96k"] Mar 10 00:24:00 crc kubenswrapper[4994]: E0310 00:24:00.157365 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="998fc9e9-e892-4f8f-9c76-6a6129ab98e7" containerName="docker-build" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.157395 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="998fc9e9-e892-4f8f-9c76-6a6129ab98e7" containerName="docker-build" Mar 10 00:24:00 crc kubenswrapper[4994]: E0310 00:24:00.157462 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="998fc9e9-e892-4f8f-9c76-6a6129ab98e7" containerName="git-clone" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.157479 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="998fc9e9-e892-4f8f-9c76-6a6129ab98e7" containerName="git-clone" Mar 10 00:24:00 crc kubenswrapper[4994]: E0310 00:24:00.157524 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="998fc9e9-e892-4f8f-9c76-6a6129ab98e7" containerName="manage-dockerfile" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.157543 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="998fc9e9-e892-4f8f-9c76-6a6129ab98e7" containerName="manage-dockerfile" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.158097 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="998fc9e9-e892-4f8f-9c76-6a6129ab98e7" containerName="docker-build" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.159549 4994 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551704-8n96k" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.164894 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.171065 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.171065 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.174169 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551704-8n96k"] Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.339958 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-754pt\" (UniqueName: \"kubernetes.io/projected/ca1779b4-8945-4667-b086-b7481edf1099-kube-api-access-754pt\") pod \"auto-csr-approver-29551704-8n96k\" (UID: \"ca1779b4-8945-4667-b086-b7481edf1099\") " pod="openshift-infra/auto-csr-approver-29551704-8n96k" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.443520 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-754pt\" (UniqueName: \"kubernetes.io/projected/ca1779b4-8945-4667-b086-b7481edf1099-kube-api-access-754pt\") pod \"auto-csr-approver-29551704-8n96k\" (UID: \"ca1779b4-8945-4667-b086-b7481edf1099\") " pod="openshift-infra/auto-csr-approver-29551704-8n96k" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.473474 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-754pt\" (UniqueName: \"kubernetes.io/projected/ca1779b4-8945-4667-b086-b7481edf1099-kube-api-access-754pt\") pod \"auto-csr-approver-29551704-8n96k\" (UID: \"ca1779b4-8945-4667-b086-b7481edf1099\") " 
pod="openshift-infra/auto-csr-approver-29551704-8n96k" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.492206 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551704-8n96k" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.915558 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551704-8n96k"] Mar 10 00:24:01 crc kubenswrapper[4994]: I0310 00:24:01.106600 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551704-8n96k" event={"ID":"ca1779b4-8945-4667-b086-b7481edf1099","Type":"ContainerStarted","Data":"f27aa240ed8c53a1f70a3cb47c55285d93f9439410091b3f051238f53f054807"} Mar 10 00:24:01 crc kubenswrapper[4994]: I0310 00:24:01.990006 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 10 00:24:01 crc kubenswrapper[4994]: I0310 00:24:01.991714 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 10 00:24:01 crc kubenswrapper[4994]: I0310 00:24:01.993537 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-ca" Mar 10 00:24:01 crc kubenswrapper[4994]: I0310 00:24:01.994248 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-sys-config" Mar 10 00:24:01 crc kubenswrapper[4994]: I0310 00:24:01.994745 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:24:01 crc kubenswrapper[4994]: I0310 00:24:01.997063 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-global-ca" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.017123 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.175542 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-system-configs\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.175654 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildcachedir\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.175701 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glhr2\" (UniqueName: \"kubernetes.io/projected/f71b7ce5-faba-47c4-b13e-5d333f06efc3-kube-api-access-glhr2\") 
pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.175851 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.176001 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.176203 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-root\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.176296 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.176366 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-proxy-ca-bundles\") pod 
\"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.176487 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildworkdir\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.176602 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-run\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.176835 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-pull\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.176931 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-push\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.287680 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: 
\"kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-pull\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.287791 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-push\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.287852 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-system-configs\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.287961 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildcachedir\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.288022 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glhr2\" (UniqueName: \"kubernetes.io/projected/f71b7ce5-faba-47c4-b13e-5d333f06efc3-kube-api-access-glhr2\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.288100 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-node-pullsecrets\") 
pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.288145 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.288210 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.288256 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-root\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.288306 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildcachedir\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.288943 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc 
kubenswrapper[4994]: I0310 00:24:02.290002 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-system-configs\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.288318 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.290133 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.290174 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildworkdir\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.290170 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-root\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.290261 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.290271 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-run\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.290430 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildworkdir\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.291658 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:04 crc kubenswrapper[4994]: I0310 00:24:04.293812 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-run\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:04 crc kubenswrapper[4994]: I0310 00:24:04.294758 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: 
\"kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-push\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:04 crc kubenswrapper[4994]: I0310 00:24:04.294792 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-pull\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:04 crc kubenswrapper[4994]: I0310 00:24:04.306664 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glhr2\" (UniqueName: \"kubernetes.io/projected/f71b7ce5-faba-47c4-b13e-5d333f06efc3-kube-api-access-glhr2\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:04 crc kubenswrapper[4994]: I0310 00:24:04.410806 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 10 00:24:04 crc kubenswrapper[4994]: I0310 00:24:04.632609 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 10 00:24:04 crc kubenswrapper[4994]: W0310 00:24:04.642157 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf71b7ce5_faba_47c4_b13e_5d333f06efc3.slice/crio-4cfefb7397487a02866f52632158f74eea10ce28127274c9486d96660b536555 WatchSource:0}: Error finding container 4cfefb7397487a02866f52632158f74eea10ce28127274c9486d96660b536555: Status 404 returned error can't find the container with id 4cfefb7397487a02866f52632158f74eea10ce28127274c9486d96660b536555 Mar 10 00:24:05 crc kubenswrapper[4994]: I0310 00:24:05.142413 4994 generic.go:334] "Generic (PLEG): container finished" podID="f71b7ce5-faba-47c4-b13e-5d333f06efc3" containerID="3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36" exitCode=0 Mar 10 00:24:05 crc kubenswrapper[4994]: I0310 00:24:05.142492 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"f71b7ce5-faba-47c4-b13e-5d333f06efc3","Type":"ContainerDied","Data":"3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36"} Mar 10 00:24:05 crc kubenswrapper[4994]: I0310 00:24:05.143076 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"f71b7ce5-faba-47c4-b13e-5d333f06efc3","Type":"ContainerStarted","Data":"4cfefb7397487a02866f52632158f74eea10ce28127274c9486d96660b536555"} Mar 10 00:24:05 crc kubenswrapper[4994]: I0310 00:24:05.148835 4994 generic.go:334] "Generic (PLEG): container finished" podID="ca1779b4-8945-4667-b086-b7481edf1099" containerID="05dbb86f2a8b07de1b18fc5d17c0892e037b64a657609562e3eb766699973201" exitCode=0 Mar 10 00:24:05 crc kubenswrapper[4994]: I0310 00:24:05.149103 4994 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-infra/auto-csr-approver-29551704-8n96k" event={"ID":"ca1779b4-8945-4667-b086-b7481edf1099","Type":"ContainerDied","Data":"05dbb86f2a8b07de1b18fc5d17c0892e037b64a657609562e3eb766699973201"} Mar 10 00:24:06 crc kubenswrapper[4994]: I0310 00:24:06.163555 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"f71b7ce5-faba-47c4-b13e-5d333f06efc3","Type":"ContainerStarted","Data":"5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16"} Mar 10 00:24:06 crc kubenswrapper[4994]: I0310 00:24:06.197541 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-1-build" podStartSLOduration=5.19751942 podStartE2EDuration="5.19751942s" podCreationTimestamp="2026-03-10 00:24:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:24:06.192634898 +0000 UTC m=+1060.366341667" watchObservedRunningTime="2026-03-10 00:24:06.19751942 +0000 UTC m=+1060.371226169" Mar 10 00:24:06 crc kubenswrapper[4994]: I0310 00:24:06.427575 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551704-8n96k" Mar 10 00:24:06 crc kubenswrapper[4994]: I0310 00:24:06.456002 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-754pt\" (UniqueName: \"kubernetes.io/projected/ca1779b4-8945-4667-b086-b7481edf1099-kube-api-access-754pt\") pod \"ca1779b4-8945-4667-b086-b7481edf1099\" (UID: \"ca1779b4-8945-4667-b086-b7481edf1099\") " Mar 10 00:24:06 crc kubenswrapper[4994]: I0310 00:24:06.464088 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca1779b4-8945-4667-b086-b7481edf1099-kube-api-access-754pt" (OuterVolumeSpecName: "kube-api-access-754pt") pod "ca1779b4-8945-4667-b086-b7481edf1099" (UID: "ca1779b4-8945-4667-b086-b7481edf1099"). InnerVolumeSpecName "kube-api-access-754pt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:24:06 crc kubenswrapper[4994]: I0310 00:24:06.557149 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-754pt\" (UniqueName: \"kubernetes.io/projected/ca1779b4-8945-4667-b086-b7481edf1099-kube-api-access-754pt\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:07 crc kubenswrapper[4994]: I0310 00:24:07.174331 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551704-8n96k" event={"ID":"ca1779b4-8945-4667-b086-b7481edf1099","Type":"ContainerDied","Data":"f27aa240ed8c53a1f70a3cb47c55285d93f9439410091b3f051238f53f054807"} Mar 10 00:24:07 crc kubenswrapper[4994]: I0310 00:24:07.174384 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551704-8n96k" Mar 10 00:24:07 crc kubenswrapper[4994]: I0310 00:24:07.174413 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f27aa240ed8c53a1f70a3cb47c55285d93f9439410091b3f051238f53f054807" Mar 10 00:24:07 crc kubenswrapper[4994]: I0310 00:24:07.502917 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551698-2d6g5"] Mar 10 00:24:07 crc kubenswrapper[4994]: I0310 00:24:07.512360 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551698-2d6g5"] Mar 10 00:24:08 crc kubenswrapper[4994]: I0310 00:24:08.570300 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6471fd89-1c92-498d-ba15-149418259c58" path="/var/lib/kubelet/pods/6471fd89-1c92-498d-ba15-149418259c58/volumes" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.331984 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.332474 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-core-1-build" podUID="f71b7ce5-faba-47c4-b13e-5d333f06efc3" containerName="docker-build" containerID="cri-o://5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16" gracePeriod=30 Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.747775 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_f71b7ce5-faba-47c4-b13e-5d333f06efc3/docker-build/0.log" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.748677 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.846463 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildworkdir\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.846508 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-pull\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.846540 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-proxy-ca-bundles\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.846569 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-node-pullsecrets\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.846586 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-ca-bundles\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.846602 4994 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-push\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.847210 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.847337 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.847534 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.847614 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). 
InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.852158 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.852587 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948008 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-system-configs\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948087 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-root\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948163 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glhr2\" (UniqueName: 
\"kubernetes.io/projected/f71b7ce5-faba-47c4-b13e-5d333f06efc3-kube-api-access-glhr2\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948191 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-blob-cache\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948211 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildcachedir\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948237 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-run\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948827 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948869 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948892 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948952 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948939 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948971 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.949046 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.949408 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.949863 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.957470 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f71b7ce5-faba-47c4-b13e-5d333f06efc3-kube-api-access-glhr2" (OuterVolumeSpecName: "kube-api-access-glhr2") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "kube-api-access-glhr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.038198 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.049846 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.049872 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.049881 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.049891 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.049899 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glhr2\" (UniqueName: \"kubernetes.io/projected/f71b7ce5-faba-47c4-b13e-5d333f06efc3-kube-api-access-glhr2\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.088595 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.151142 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.222490 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_f71b7ce5-faba-47c4-b13e-5d333f06efc3/docker-build/0.log" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.222996 4994 generic.go:334] "Generic (PLEG): container finished" podID="f71b7ce5-faba-47c4-b13e-5d333f06efc3" containerID="5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16" exitCode=1 Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.223047 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"f71b7ce5-faba-47c4-b13e-5d333f06efc3","Type":"ContainerDied","Data":"5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16"} Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.223085 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"f71b7ce5-faba-47c4-b13e-5d333f06efc3","Type":"ContainerDied","Data":"4cfefb7397487a02866f52632158f74eea10ce28127274c9486d96660b536555"} Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.223109 4994 scope.go:117] "RemoveContainer" containerID="5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.223200 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.271736 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.273525 4994 scope.go:117] "RemoveContainer" containerID="3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.278875 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.303124 4994 scope.go:117] "RemoveContainer" containerID="5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16" Mar 10 00:24:13 crc kubenswrapper[4994]: E0310 00:24:13.303739 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16\": container with ID starting with 5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16 not found: ID does not exist" containerID="5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.303787 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16"} err="failed to get container status \"5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16\": rpc error: code = NotFound desc = could not find container \"5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16\": container with ID starting with 5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16 not found: ID does not exist" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.303818 4994 scope.go:117] "RemoveContainer" containerID="3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36" Mar 10 00:24:13 crc 
kubenswrapper[4994]: E0310 00:24:13.304179 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36\": container with ID starting with 3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36 not found: ID does not exist" containerID="3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.304200 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36"} err="failed to get container status \"3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36\": rpc error: code = NotFound desc = could not find container \"3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36\": container with ID starting with 3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36 not found: ID does not exist" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.027092 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 10 00:24:14 crc kubenswrapper[4994]: E0310 00:24:14.027490 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f71b7ce5-faba-47c4-b13e-5d333f06efc3" containerName="docker-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.027517 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71b7ce5-faba-47c4-b13e-5d333f06efc3" containerName="docker-build" Mar 10 00:24:14 crc kubenswrapper[4994]: E0310 00:24:14.027539 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f71b7ce5-faba-47c4-b13e-5d333f06efc3" containerName="manage-dockerfile" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.027553 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71b7ce5-faba-47c4-b13e-5d333f06efc3" containerName="manage-dockerfile" Mar 10 
00:24:14 crc kubenswrapper[4994]: E0310 00:24:14.027570 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1779b4-8945-4667-b086-b7481edf1099" containerName="oc" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.027582 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1779b4-8945-4667-b086-b7481edf1099" containerName="oc" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.027794 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f71b7ce5-faba-47c4-b13e-5d333f06efc3" containerName="docker-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.027836 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca1779b4-8945-4667-b086-b7481edf1099" containerName="oc" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.029454 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.032138 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-global-ca" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.032859 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-sys-config" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.033274 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.033535 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-ca" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.045010 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.069365 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-run\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.069424 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-buildworkdir\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.069450 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-root\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.069530 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-system-configs\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.069635 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.069746 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-buildcachedir\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.069944 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-push\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.070050 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mgnf\" (UniqueName: \"kubernetes.io/projected/45ce1682-4d9a-4e51-ae93-a1832751d811-kube-api-access-7mgnf\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.070106 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-pull\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.070221 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.070263 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.070294 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172070 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172327 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172534 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-buildcachedir\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172483 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-buildcachedir\") pod 
\"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172605 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-push\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172643 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mgnf\" (UniqueName: \"kubernetes.io/projected/45ce1682-4d9a-4e51-ae93-a1832751d811-kube-api-access-7mgnf\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172673 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-pull\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172727 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172759 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: 
\"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172786 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172832 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-run\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172859 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-buildworkdir\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172890 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-system-configs\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172942 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-root\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 
crc kubenswrapper[4994]: I0310 00:24:14.173274 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.173651 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-root\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.173736 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-run\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.174104 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.174152 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-buildworkdir\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.174268 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-system-configs\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.174369 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.176716 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-pull\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.176816 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-push\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.194510 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mgnf\" (UniqueName: \"kubernetes.io/projected/45ce1682-4d9a-4e51-ae93-a1832751d811-kube-api-access-7mgnf\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.359944 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.564663 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f71b7ce5-faba-47c4-b13e-5d333f06efc3" path="/var/lib/kubelet/pods/f71b7ce5-faba-47c4-b13e-5d333f06efc3/volumes" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.668476 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 10 00:24:15 crc kubenswrapper[4994]: I0310 00:24:15.241008 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"45ce1682-4d9a-4e51-ae93-a1832751d811","Type":"ContainerStarted","Data":"b38fd394341d0bca8c978c6208bc322f1303868e9303842247913a0129a0ba66"} Mar 10 00:24:15 crc kubenswrapper[4994]: I0310 00:24:15.241362 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"45ce1682-4d9a-4e51-ae93-a1832751d811","Type":"ContainerStarted","Data":"4cd2b54ef27177ce7bd06c1a0c62cb99ddedbaaaa4b7415ca746e6d6a53589fe"} Mar 10 00:24:16 crc kubenswrapper[4994]: I0310 00:24:16.250682 4994 generic.go:334] "Generic (PLEG): container finished" podID="45ce1682-4d9a-4e51-ae93-a1832751d811" containerID="b38fd394341d0bca8c978c6208bc322f1303868e9303842247913a0129a0ba66" exitCode=0 Mar 10 00:24:16 crc kubenswrapper[4994]: I0310 00:24:16.250881 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"45ce1682-4d9a-4e51-ae93-a1832751d811","Type":"ContainerDied","Data":"b38fd394341d0bca8c978c6208bc322f1303868e9303842247913a0129a0ba66"} Mar 10 00:24:18 crc kubenswrapper[4994]: I0310 00:24:18.893074 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 10 00:24:18 crc kubenswrapper[4994]: I0310 00:24:18.893900 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:24:21 crc kubenswrapper[4994]: I0310 00:24:21.475015 4994 scope.go:117] "RemoveContainer" containerID="b02eba80da92e156810a460cd2fd2b2fbae8ce74141ff71d34b4fc6b8bc7db3f" Mar 10 00:24:23 crc kubenswrapper[4994]: I0310 00:24:23.317818 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"45ce1682-4d9a-4e51-ae93-a1832751d811","Type":"ContainerStarted","Data":"b7709b48ec49d7397d6227aa1d44c39adb7449f43816e0b86669024bb3c14d3c"} Mar 10 00:24:24 crc kubenswrapper[4994]: I0310 00:24:24.328182 4994 generic.go:334] "Generic (PLEG): container finished" podID="45ce1682-4d9a-4e51-ae93-a1832751d811" containerID="b7709b48ec49d7397d6227aa1d44c39adb7449f43816e0b86669024bb3c14d3c" exitCode=0 Mar 10 00:24:24 crc kubenswrapper[4994]: I0310 00:24:24.328240 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"45ce1682-4d9a-4e51-ae93-a1832751d811","Type":"ContainerDied","Data":"b7709b48ec49d7397d6227aa1d44c39adb7449f43816e0b86669024bb3c14d3c"} Mar 10 00:24:25 crc kubenswrapper[4994]: I0310 00:24:25.337049 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"45ce1682-4d9a-4e51-ae93-a1832751d811","Type":"ContainerStarted","Data":"6966590713cea1dd54724874068e8b954718fd9eea114677b3e7a90c5394c8c0"} Mar 10 00:24:25 crc kubenswrapper[4994]: I0310 00:24:25.366629 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-2-build" podStartSLOduration=12.366609381 
podStartE2EDuration="12.366609381s" podCreationTimestamp="2026-03-10 00:24:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:24:25.364391881 +0000 UTC m=+1079.538098630" watchObservedRunningTime="2026-03-10 00:24:25.366609381 +0000 UTC m=+1079.540316130" Mar 10 00:24:48 crc kubenswrapper[4994]: I0310 00:24:48.892287 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:24:48 crc kubenswrapper[4994]: I0310 00:24:48.892953 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.102607 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xmg88"] Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.104286 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.122432 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xmg88"] Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.226383 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p55nx\" (UniqueName: \"kubernetes.io/projected/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-kube-api-access-p55nx\") pod \"redhat-operators-xmg88\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.226440 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-utilities\") pod \"redhat-operators-xmg88\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.226498 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-catalog-content\") pod \"redhat-operators-xmg88\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.327591 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-catalog-content\") pod \"redhat-operators-xmg88\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.327722 4994 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-p55nx\" (UniqueName: \"kubernetes.io/projected/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-kube-api-access-p55nx\") pod \"redhat-operators-xmg88\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.327744 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-utilities\") pod \"redhat-operators-xmg88\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.328187 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-catalog-content\") pod \"redhat-operators-xmg88\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.328257 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-utilities\") pod \"redhat-operators-xmg88\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.345815 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p55nx\" (UniqueName: \"kubernetes.io/projected/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-kube-api-access-p55nx\") pod \"redhat-operators-xmg88\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.423865 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.647789 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xmg88"] Mar 10 00:25:06 crc kubenswrapper[4994]: I0310 00:25:06.625057 4994 generic.go:334] "Generic (PLEG): container finished" podID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerID="2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259" exitCode=0 Mar 10 00:25:06 crc kubenswrapper[4994]: I0310 00:25:06.625147 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmg88" event={"ID":"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a","Type":"ContainerDied","Data":"2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259"} Mar 10 00:25:06 crc kubenswrapper[4994]: I0310 00:25:06.625374 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmg88" event={"ID":"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a","Type":"ContainerStarted","Data":"35da12795b727384764d62c7284523d0731f85372bc44799ea418bef94b3a2c0"} Mar 10 00:25:09 crc kubenswrapper[4994]: I0310 00:25:09.644417 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmg88" event={"ID":"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a","Type":"ContainerStarted","Data":"3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f"} Mar 10 00:25:11 crc kubenswrapper[4994]: I0310 00:25:11.661295 4994 generic.go:334] "Generic (PLEG): container finished" podID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerID="3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f" exitCode=0 Mar 10 00:25:11 crc kubenswrapper[4994]: I0310 00:25:11.661370 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmg88" 
event={"ID":"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a","Type":"ContainerDied","Data":"3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f"} Mar 10 00:25:12 crc kubenswrapper[4994]: I0310 00:25:12.672659 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmg88" event={"ID":"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a","Type":"ContainerStarted","Data":"e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313"} Mar 10 00:25:12 crc kubenswrapper[4994]: I0310 00:25:12.701261 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xmg88" podStartSLOduration=2.167660839 podStartE2EDuration="7.701227965s" podCreationTimestamp="2026-03-10 00:25:05 +0000 UTC" firstStartedPulling="2026-03-10 00:25:06.626711989 +0000 UTC m=+1120.800418738" lastFinishedPulling="2026-03-10 00:25:12.160279075 +0000 UTC m=+1126.333985864" observedRunningTime="2026-03-10 00:25:12.692547512 +0000 UTC m=+1126.866254351" watchObservedRunningTime="2026-03-10 00:25:12.701227965 +0000 UTC m=+1126.874934754" Mar 10 00:25:15 crc kubenswrapper[4994]: I0310 00:25:15.424350 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:15 crc kubenswrapper[4994]: I0310 00:25:15.424433 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:16 crc kubenswrapper[4994]: I0310 00:25:16.462526 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xmg88" podUID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerName="registry-server" probeResult="failure" output=< Mar 10 00:25:16 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s Mar 10 00:25:16 crc kubenswrapper[4994]: > Mar 10 00:25:18 crc kubenswrapper[4994]: I0310 00:25:18.892984 4994 patch_prober.go:28] interesting 
pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:25:18 crc kubenswrapper[4994]: I0310 00:25:18.894136 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:25:18 crc kubenswrapper[4994]: I0310 00:25:18.894251 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:25:18 crc kubenswrapper[4994]: I0310 00:25:18.895262 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4df6fccd58598d7ef69c17858b0dc84c63fc5b8f887c3889c08c1cb8b68f120e"} pod="openshift-machine-config-operator/machine-config-daemon-kfljj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 00:25:18 crc kubenswrapper[4994]: I0310 00:25:18.895369 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" containerID="cri-o://4df6fccd58598d7ef69c17858b0dc84c63fc5b8f887c3889c08c1cb8b68f120e" gracePeriod=600 Mar 10 00:25:19 crc kubenswrapper[4994]: I0310 00:25:19.734708 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" 
event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerDied","Data":"4df6fccd58598d7ef69c17858b0dc84c63fc5b8f887c3889c08c1cb8b68f120e"} Mar 10 00:25:19 crc kubenswrapper[4994]: I0310 00:25:19.735184 4994 scope.go:117] "RemoveContainer" containerID="226ba5dd02665930d82054325a4e53e30bff51d1812a48cc472d4cc84e6237db" Mar 10 00:25:19 crc kubenswrapper[4994]: I0310 00:25:19.736227 4994 generic.go:334] "Generic (PLEG): container finished" podID="ced5d66d-39df-4267-b801-e1e60d517ace" containerID="4df6fccd58598d7ef69c17858b0dc84c63fc5b8f887c3889c08c1cb8b68f120e" exitCode=0 Mar 10 00:25:20 crc kubenswrapper[4994]: I0310 00:25:20.748456 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"779b11783e082c837efecec96b026cd3be87293636b7184dfd3efe1ae146c491"} Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.219336 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wb9zw"] Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.221456 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.226107 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wb9zw"] Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.318015 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg5hc\" (UniqueName: \"kubernetes.io/projected/a3025eba-bc72-4841-8d11-e07912a08204-kube-api-access-xg5hc\") pod \"community-operators-wb9zw\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.318399 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-catalog-content\") pod \"community-operators-wb9zw\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.318476 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-utilities\") pod \"community-operators-wb9zw\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.420076 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-utilities\") pod \"community-operators-wb9zw\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.420164 4994 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xg5hc\" (UniqueName: \"kubernetes.io/projected/a3025eba-bc72-4841-8d11-e07912a08204-kube-api-access-xg5hc\") pod \"community-operators-wb9zw\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.420185 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-catalog-content\") pod \"community-operators-wb9zw\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.420739 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-utilities\") pod \"community-operators-wb9zw\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.420763 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-catalog-content\") pod \"community-operators-wb9zw\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.450691 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg5hc\" (UniqueName: \"kubernetes.io/projected/a3025eba-bc72-4841-8d11-e07912a08204-kube-api-access-xg5hc\") pod \"community-operators-wb9zw\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.467807 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.505338 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.550238 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.828048 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wb9zw"] Mar 10 00:25:26 crc kubenswrapper[4994]: I0310 00:25:26.804190 4994 generic.go:334] "Generic (PLEG): container finished" podID="a3025eba-bc72-4841-8d11-e07912a08204" containerID="3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4" exitCode=0 Mar 10 00:25:26 crc kubenswrapper[4994]: I0310 00:25:26.804283 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb9zw" event={"ID":"a3025eba-bc72-4841-8d11-e07912a08204","Type":"ContainerDied","Data":"3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4"} Mar 10 00:25:26 crc kubenswrapper[4994]: I0310 00:25:26.804640 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb9zw" event={"ID":"a3025eba-bc72-4841-8d11-e07912a08204","Type":"ContainerStarted","Data":"a6eab4ce936f87d49ced6dfd2bb1e60e9d0032bca1642da7e1ee18e34523c380"} Mar 10 00:25:26 crc kubenswrapper[4994]: I0310 00:25:26.806597 4994 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 00:25:27 crc kubenswrapper[4994]: I0310 00:25:27.796942 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xmg88"] Mar 10 00:25:27 crc kubenswrapper[4994]: I0310 00:25:27.797532 4994 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-xmg88" podUID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerName="registry-server" containerID="cri-o://e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313" gracePeriod=2 Mar 10 00:25:27 crc kubenswrapper[4994]: I0310 00:25:27.810375 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb9zw" event={"ID":"a3025eba-bc72-4841-8d11-e07912a08204","Type":"ContainerStarted","Data":"07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8"} Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.191887 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.275650 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p55nx\" (UniqueName: \"kubernetes.io/projected/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-kube-api-access-p55nx\") pod \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.275719 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-catalog-content\") pod \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.275791 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-utilities\") pod \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.276705 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-utilities" (OuterVolumeSpecName: "utilities") pod "6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" (UID: "6cedd32c-f8fb-4e43-b82e-57c0ae8d384a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.288511 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-kube-api-access-p55nx" (OuterVolumeSpecName: "kube-api-access-p55nx") pod "6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" (UID: "6cedd32c-f8fb-4e43-b82e-57c0ae8d384a"). InnerVolumeSpecName "kube-api-access-p55nx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.377946 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p55nx\" (UniqueName: \"kubernetes.io/projected/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-kube-api-access-p55nx\") on node \"crc\" DevicePath \"\"" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.377992 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.444516 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" (UID: "6cedd32c-f8fb-4e43-b82e-57c0ae8d384a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.479828 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.817237 4994 generic.go:334] "Generic (PLEG): container finished" podID="a3025eba-bc72-4841-8d11-e07912a08204" containerID="07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8" exitCode=0 Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.817300 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb9zw" event={"ID":"a3025eba-bc72-4841-8d11-e07912a08204","Type":"ContainerDied","Data":"07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8"} Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.820768 4994 generic.go:334] "Generic (PLEG): container finished" podID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerID="e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313" exitCode=0 Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.820792 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmg88" event={"ID":"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a","Type":"ContainerDied","Data":"e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313"} Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.820811 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmg88" event={"ID":"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a","Type":"ContainerDied","Data":"35da12795b727384764d62c7284523d0731f85372bc44799ea418bef94b3a2c0"} Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.820813 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.820861 4994 scope.go:117] "RemoveContainer" containerID="e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.837161 4994 scope.go:117] "RemoveContainer" containerID="3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.842263 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xmg88"] Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.848850 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xmg88"] Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.865772 4994 scope.go:117] "RemoveContainer" containerID="2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.898991 4994 scope.go:117] "RemoveContainer" containerID="e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313" Mar 10 00:25:28 crc kubenswrapper[4994]: E0310 00:25:28.899564 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313\": container with ID starting with e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313 not found: ID does not exist" containerID="e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.899701 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313"} err="failed to get container status \"e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313\": rpc error: code = NotFound desc = could not find container 
\"e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313\": container with ID starting with e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313 not found: ID does not exist" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.899794 4994 scope.go:117] "RemoveContainer" containerID="3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f" Mar 10 00:25:28 crc kubenswrapper[4994]: E0310 00:25:28.901815 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f\": container with ID starting with 3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f not found: ID does not exist" containerID="3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.901924 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f"} err="failed to get container status \"3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f\": rpc error: code = NotFound desc = could not find container \"3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f\": container with ID starting with 3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f not found: ID does not exist" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.901992 4994 scope.go:117] "RemoveContainer" containerID="2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259" Mar 10 00:25:28 crc kubenswrapper[4994]: E0310 00:25:28.905130 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259\": container with ID starting with 2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259 not found: ID does not exist" 
containerID="2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.905232 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259"} err="failed to get container status \"2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259\": rpc error: code = NotFound desc = could not find container \"2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259\": container with ID starting with 2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259 not found: ID does not exist" Mar 10 00:25:29 crc kubenswrapper[4994]: I0310 00:25:29.828665 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb9zw" event={"ID":"a3025eba-bc72-4841-8d11-e07912a08204","Type":"ContainerStarted","Data":"5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994"} Mar 10 00:25:29 crc kubenswrapper[4994]: I0310 00:25:29.849975 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wb9zw" podStartSLOduration=2.349913155 podStartE2EDuration="4.849954514s" podCreationTimestamp="2026-03-10 00:25:25 +0000 UTC" firstStartedPulling="2026-03-10 00:25:26.806323274 +0000 UTC m=+1140.980030013" lastFinishedPulling="2026-03-10 00:25:29.306364623 +0000 UTC m=+1143.480071372" observedRunningTime="2026-03-10 00:25:29.848383631 +0000 UTC m=+1144.022090440" watchObservedRunningTime="2026-03-10 00:25:29.849954514 +0000 UTC m=+1144.023661273" Mar 10 00:25:30 crc kubenswrapper[4994]: I0310 00:25:30.560591 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" path="/var/lib/kubelet/pods/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a/volumes" Mar 10 00:25:35 crc kubenswrapper[4994]: I0310 00:25:35.550772 4994 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:35 crc kubenswrapper[4994]: I0310 00:25:35.551254 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:35 crc kubenswrapper[4994]: I0310 00:25:35.625055 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:35 crc kubenswrapper[4994]: I0310 00:25:35.924855 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:36 crc kubenswrapper[4994]: I0310 00:25:36.310031 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wb9zw"] Mar 10 00:25:37 crc kubenswrapper[4994]: I0310 00:25:37.891618 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wb9zw" podUID="a3025eba-bc72-4841-8d11-e07912a08204" containerName="registry-server" containerID="cri-o://5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994" gracePeriod=2 Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.342610 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.440671 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg5hc\" (UniqueName: \"kubernetes.io/projected/a3025eba-bc72-4841-8d11-e07912a08204-kube-api-access-xg5hc\") pod \"a3025eba-bc72-4841-8d11-e07912a08204\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.440905 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-catalog-content\") pod \"a3025eba-bc72-4841-8d11-e07912a08204\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.441028 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-utilities\") pod \"a3025eba-bc72-4841-8d11-e07912a08204\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.442280 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-utilities" (OuterVolumeSpecName: "utilities") pod "a3025eba-bc72-4841-8d11-e07912a08204" (UID: "a3025eba-bc72-4841-8d11-e07912a08204"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.452197 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3025eba-bc72-4841-8d11-e07912a08204-kube-api-access-xg5hc" (OuterVolumeSpecName: "kube-api-access-xg5hc") pod "a3025eba-bc72-4841-8d11-e07912a08204" (UID: "a3025eba-bc72-4841-8d11-e07912a08204"). InnerVolumeSpecName "kube-api-access-xg5hc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.543384 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg5hc\" (UniqueName: \"kubernetes.io/projected/a3025eba-bc72-4841-8d11-e07912a08204-kube-api-access-xg5hc\") on node \"crc\" DevicePath \"\"" Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.543430 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.924963 4994 generic.go:334] "Generic (PLEG): container finished" podID="a3025eba-bc72-4841-8d11-e07912a08204" containerID="5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994" exitCode=0 Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.925070 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb9zw" event={"ID":"a3025eba-bc72-4841-8d11-e07912a08204","Type":"ContainerDied","Data":"5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994"} Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.925161 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb9zw" event={"ID":"a3025eba-bc72-4841-8d11-e07912a08204","Type":"ContainerDied","Data":"a6eab4ce936f87d49ced6dfd2bb1e60e9d0032bca1642da7e1ee18e34523c380"} Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.925153 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.925198 4994 scope.go:117] "RemoveContainer" containerID="5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994" Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.947824 4994 scope.go:117] "RemoveContainer" containerID="07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8" Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.973803 4994 scope.go:117] "RemoveContainer" containerID="3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4" Mar 10 00:25:39 crc kubenswrapper[4994]: I0310 00:25:39.022838 4994 scope.go:117] "RemoveContainer" containerID="5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994" Mar 10 00:25:39 crc kubenswrapper[4994]: E0310 00:25:39.023896 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994\": container with ID starting with 5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994 not found: ID does not exist" containerID="5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994" Mar 10 00:25:39 crc kubenswrapper[4994]: I0310 00:25:39.023967 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994"} err="failed to get container status \"5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994\": rpc error: code = NotFound desc = could not find container \"5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994\": container with ID starting with 5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994 not found: ID does not exist" Mar 10 00:25:39 crc kubenswrapper[4994]: I0310 00:25:39.024015 4994 scope.go:117] "RemoveContainer" 
containerID="07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8" Mar 10 00:25:39 crc kubenswrapper[4994]: E0310 00:25:39.025338 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8\": container with ID starting with 07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8 not found: ID does not exist" containerID="07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8" Mar 10 00:25:39 crc kubenswrapper[4994]: I0310 00:25:39.025374 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8"} err="failed to get container status \"07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8\": rpc error: code = NotFound desc = could not find container \"07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8\": container with ID starting with 07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8 not found: ID does not exist" Mar 10 00:25:39 crc kubenswrapper[4994]: I0310 00:25:39.025399 4994 scope.go:117] "RemoveContainer" containerID="3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4" Mar 10 00:25:39 crc kubenswrapper[4994]: E0310 00:25:39.025843 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4\": container with ID starting with 3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4 not found: ID does not exist" containerID="3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4" Mar 10 00:25:39 crc kubenswrapper[4994]: I0310 00:25:39.025863 4994 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4"} err="failed to get container status \"3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4\": rpc error: code = NotFound desc = could not find container \"3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4\": container with ID starting with 3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4 not found: ID does not exist" Mar 10 00:25:39 crc kubenswrapper[4994]: I0310 00:25:39.272330 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3025eba-bc72-4841-8d11-e07912a08204" (UID: "a3025eba-bc72-4841-8d11-e07912a08204"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:25:39 crc kubenswrapper[4994]: I0310 00:25:39.355366 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:25:39 crc kubenswrapper[4994]: I0310 00:25:39.582946 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wb9zw"] Mar 10 00:25:39 crc kubenswrapper[4994]: I0310 00:25:39.597258 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wb9zw"] Mar 10 00:25:40 crc kubenswrapper[4994]: I0310 00:25:40.561896 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3025eba-bc72-4841-8d11-e07912a08204" path="/var/lib/kubelet/pods/a3025eba-bc72-4841-8d11-e07912a08204/volumes" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.130044 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551706-trj4h"] Mar 10 00:26:00 crc kubenswrapper[4994]: E0310 
00:26:00.130649 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerName="registry-server" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.130660 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerName="registry-server" Mar 10 00:26:00 crc kubenswrapper[4994]: E0310 00:26:00.130673 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3025eba-bc72-4841-8d11-e07912a08204" containerName="extract-utilities" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.130679 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3025eba-bc72-4841-8d11-e07912a08204" containerName="extract-utilities" Mar 10 00:26:00 crc kubenswrapper[4994]: E0310 00:26:00.130690 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerName="extract-utilities" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.130697 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerName="extract-utilities" Mar 10 00:26:00 crc kubenswrapper[4994]: E0310 00:26:00.130705 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3025eba-bc72-4841-8d11-e07912a08204" containerName="registry-server" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.130711 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3025eba-bc72-4841-8d11-e07912a08204" containerName="registry-server" Mar 10 00:26:00 crc kubenswrapper[4994]: E0310 00:26:00.130724 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerName="extract-content" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.130729 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerName="extract-content" Mar 10 00:26:00 crc kubenswrapper[4994]: E0310 
00:26:00.130739 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3025eba-bc72-4841-8d11-e07912a08204" containerName="extract-content" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.130745 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3025eba-bc72-4841-8d11-e07912a08204" containerName="extract-content" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.130845 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerName="registry-server" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.130858 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3025eba-bc72-4841-8d11-e07912a08204" containerName="registry-server" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.131243 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551706-trj4h" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.134395 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.135092 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.141591 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.144646 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551706-trj4h"] Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.201353 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7thp\" (UniqueName: \"kubernetes.io/projected/c045a416-3fda-4dc3-b95a-15be10565d84-kube-api-access-p7thp\") pod \"auto-csr-approver-29551706-trj4h\" (UID: 
\"c045a416-3fda-4dc3-b95a-15be10565d84\") " pod="openshift-infra/auto-csr-approver-29551706-trj4h" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.302544 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7thp\" (UniqueName: \"kubernetes.io/projected/c045a416-3fda-4dc3-b95a-15be10565d84-kube-api-access-p7thp\") pod \"auto-csr-approver-29551706-trj4h\" (UID: \"c045a416-3fda-4dc3-b95a-15be10565d84\") " pod="openshift-infra/auto-csr-approver-29551706-trj4h" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.325756 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7thp\" (UniqueName: \"kubernetes.io/projected/c045a416-3fda-4dc3-b95a-15be10565d84-kube-api-access-p7thp\") pod \"auto-csr-approver-29551706-trj4h\" (UID: \"c045a416-3fda-4dc3-b95a-15be10565d84\") " pod="openshift-infra/auto-csr-approver-29551706-trj4h" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.449694 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551706-trj4h" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.877170 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551706-trj4h"] Mar 10 00:26:01 crc kubenswrapper[4994]: I0310 00:26:01.106861 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551706-trj4h" event={"ID":"c045a416-3fda-4dc3-b95a-15be10565d84","Type":"ContainerStarted","Data":"74f169ce0d92034fbe5865a3ef8ad30cc13e7968079cd4710eae07c024c594a6"} Mar 10 00:26:02 crc kubenswrapper[4994]: I0310 00:26:02.114379 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551706-trj4h" event={"ID":"c045a416-3fda-4dc3-b95a-15be10565d84","Type":"ContainerStarted","Data":"4533fb5052fb646d2bdd6d148243818a0541d9f71db00a897459732577383e18"} Mar 10 00:26:02 crc kubenswrapper[4994]: I0310 00:26:02.128342 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551706-trj4h" podStartSLOduration=1.242171514 podStartE2EDuration="2.128320753s" podCreationTimestamp="2026-03-10 00:26:00 +0000 UTC" firstStartedPulling="2026-03-10 00:26:00.888638372 +0000 UTC m=+1175.062345131" lastFinishedPulling="2026-03-10 00:26:01.774787621 +0000 UTC m=+1175.948494370" observedRunningTime="2026-03-10 00:26:02.127000808 +0000 UTC m=+1176.300707557" watchObservedRunningTime="2026-03-10 00:26:02.128320753 +0000 UTC m=+1176.302027512" Mar 10 00:26:03 crc kubenswrapper[4994]: I0310 00:26:03.123087 4994 generic.go:334] "Generic (PLEG): container finished" podID="c045a416-3fda-4dc3-b95a-15be10565d84" containerID="4533fb5052fb646d2bdd6d148243818a0541d9f71db00a897459732577383e18" exitCode=0 Mar 10 00:26:03 crc kubenswrapper[4994]: I0310 00:26:03.123142 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551706-trj4h" 
event={"ID":"c045a416-3fda-4dc3-b95a-15be10565d84","Type":"ContainerDied","Data":"4533fb5052fb646d2bdd6d148243818a0541d9f71db00a897459732577383e18"} Mar 10 00:26:04 crc kubenswrapper[4994]: I0310 00:26:04.437492 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551706-trj4h" Mar 10 00:26:04 crc kubenswrapper[4994]: I0310 00:26:04.555448 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7thp\" (UniqueName: \"kubernetes.io/projected/c045a416-3fda-4dc3-b95a-15be10565d84-kube-api-access-p7thp\") pod \"c045a416-3fda-4dc3-b95a-15be10565d84\" (UID: \"c045a416-3fda-4dc3-b95a-15be10565d84\") " Mar 10 00:26:04 crc kubenswrapper[4994]: I0310 00:26:04.573055 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c045a416-3fda-4dc3-b95a-15be10565d84-kube-api-access-p7thp" (OuterVolumeSpecName: "kube-api-access-p7thp") pod "c045a416-3fda-4dc3-b95a-15be10565d84" (UID: "c045a416-3fda-4dc3-b95a-15be10565d84"). InnerVolumeSpecName "kube-api-access-p7thp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:26:04 crc kubenswrapper[4994]: I0310 00:26:04.657001 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7thp\" (UniqueName: \"kubernetes.io/projected/c045a416-3fda-4dc3-b95a-15be10565d84-kube-api-access-p7thp\") on node \"crc\" DevicePath \"\"" Mar 10 00:26:05 crc kubenswrapper[4994]: I0310 00:26:05.139059 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551706-trj4h" event={"ID":"c045a416-3fda-4dc3-b95a-15be10565d84","Type":"ContainerDied","Data":"74f169ce0d92034fbe5865a3ef8ad30cc13e7968079cd4710eae07c024c594a6"} Mar 10 00:26:05 crc kubenswrapper[4994]: I0310 00:26:05.139103 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74f169ce0d92034fbe5865a3ef8ad30cc13e7968079cd4710eae07c024c594a6" Mar 10 00:26:05 crc kubenswrapper[4994]: I0310 00:26:05.139162 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551706-trj4h" Mar 10 00:26:05 crc kubenswrapper[4994]: I0310 00:26:05.208310 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551700-9pnx5"] Mar 10 00:26:05 crc kubenswrapper[4994]: I0310 00:26:05.217434 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551700-9pnx5"] Mar 10 00:26:06 crc kubenswrapper[4994]: I0310 00:26:06.562536 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e07bbe9c-f27a-4256-82ba-3adc771e2ebd" path="/var/lib/kubelet/pods/e07bbe9c-f27a-4256-82ba-3adc771e2ebd/volumes" Mar 10 00:26:21 crc kubenswrapper[4994]: I0310 00:26:21.575184 4994 scope.go:117] "RemoveContainer" containerID="7f305cfa821f31b484905a2d361cdfc46f777a5744001baccc6a559f30eb2409" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.572272 4994 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-dlw24"] Mar 10 00:27:23 crc kubenswrapper[4994]: E0310 00:27:23.573227 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c045a416-3fda-4dc3-b95a-15be10565d84" containerName="oc" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.573248 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="c045a416-3fda-4dc3-b95a-15be10565d84" containerName="oc" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.573437 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="c045a416-3fda-4dc3-b95a-15be10565d84" containerName="oc" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.574745 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.595649 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dlw24"] Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.698442 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-utilities\") pod \"certified-operators-dlw24\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.698548 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmvcl\" (UniqueName: \"kubernetes.io/projected/10b2465a-61c4-4f13-9649-04138927dd46-kube-api-access-cmvcl\") pod \"certified-operators-dlw24\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.699236 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-catalog-content\") pod \"certified-operators-dlw24\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.800913 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmvcl\" (UniqueName: \"kubernetes.io/projected/10b2465a-61c4-4f13-9649-04138927dd46-kube-api-access-cmvcl\") pod \"certified-operators-dlw24\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.801084 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-catalog-content\") pod \"certified-operators-dlw24\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.801177 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-utilities\") pod \"certified-operators-dlw24\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.801681 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-utilities\") pod \"certified-operators-dlw24\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.801765 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-catalog-content\") pod \"certified-operators-dlw24\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.820218 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmvcl\" (UniqueName: \"kubernetes.io/projected/10b2465a-61c4-4f13-9649-04138927dd46-kube-api-access-cmvcl\") pod \"certified-operators-dlw24\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.897001 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:24 crc kubenswrapper[4994]: I0310 00:27:24.414087 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dlw24"] Mar 10 00:27:24 crc kubenswrapper[4994]: I0310 00:27:24.744223 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlw24" event={"ID":"10b2465a-61c4-4f13-9649-04138927dd46","Type":"ContainerStarted","Data":"ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4"} Mar 10 00:27:24 crc kubenswrapper[4994]: I0310 00:27:24.744465 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlw24" event={"ID":"10b2465a-61c4-4f13-9649-04138927dd46","Type":"ContainerStarted","Data":"a6ebce8a9cc9b04bd5530ce4f4161d48c32326fbdbfa53b02c09cc0f94ae4154"} Mar 10 00:27:25 crc kubenswrapper[4994]: I0310 00:27:25.755806 4994 generic.go:334] "Generic (PLEG): container finished" podID="10b2465a-61c4-4f13-9649-04138927dd46" containerID="ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4" exitCode=0 Mar 10 00:27:25 crc kubenswrapper[4994]: I0310 00:27:25.755865 4994 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-dlw24" event={"ID":"10b2465a-61c4-4f13-9649-04138927dd46","Type":"ContainerDied","Data":"ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4"} Mar 10 00:27:27 crc kubenswrapper[4994]: I0310 00:27:27.772140 4994 generic.go:334] "Generic (PLEG): container finished" podID="10b2465a-61c4-4f13-9649-04138927dd46" containerID="c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff" exitCode=0 Mar 10 00:27:27 crc kubenswrapper[4994]: I0310 00:27:27.772296 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlw24" event={"ID":"10b2465a-61c4-4f13-9649-04138927dd46","Type":"ContainerDied","Data":"c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff"} Mar 10 00:27:30 crc kubenswrapper[4994]: I0310 00:27:30.796345 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlw24" event={"ID":"10b2465a-61c4-4f13-9649-04138927dd46","Type":"ContainerStarted","Data":"b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141"} Mar 10 00:27:30 crc kubenswrapper[4994]: I0310 00:27:30.820691 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dlw24" podStartSLOduration=3.983302392 podStartE2EDuration="7.82065899s" podCreationTimestamp="2026-03-10 00:27:23 +0000 UTC" firstStartedPulling="2026-03-10 00:27:25.75875283 +0000 UTC m=+1259.932459619" lastFinishedPulling="2026-03-10 00:27:29.596109458 +0000 UTC m=+1263.769816217" observedRunningTime="2026-03-10 00:27:30.818288762 +0000 UTC m=+1264.991995531" watchObservedRunningTime="2026-03-10 00:27:30.82065899 +0000 UTC m=+1264.994365779" Mar 10 00:27:33 crc kubenswrapper[4994]: I0310 00:27:33.898605 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:33 crc kubenswrapper[4994]: I0310 
00:27:33.898912 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:34 crc kubenswrapper[4994]: I0310 00:27:34.941647 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-dlw24" podUID="10b2465a-61c4-4f13-9649-04138927dd46" containerName="registry-server" probeResult="failure" output=< Mar 10 00:27:34 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s Mar 10 00:27:34 crc kubenswrapper[4994]: > Mar 10 00:27:43 crc kubenswrapper[4994]: I0310 00:27:43.975987 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:44 crc kubenswrapper[4994]: I0310 00:27:44.031162 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:44 crc kubenswrapper[4994]: I0310 00:27:44.219435 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dlw24"] Mar 10 00:27:45 crc kubenswrapper[4994]: I0310 00:27:45.906991 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dlw24" podUID="10b2465a-61c4-4f13-9649-04138927dd46" containerName="registry-server" containerID="cri-o://b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141" gracePeriod=2 Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.336319 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.432322 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-catalog-content\") pod \"10b2465a-61c4-4f13-9649-04138927dd46\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.432434 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmvcl\" (UniqueName: \"kubernetes.io/projected/10b2465a-61c4-4f13-9649-04138927dd46-kube-api-access-cmvcl\") pod \"10b2465a-61c4-4f13-9649-04138927dd46\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.432480 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-utilities\") pod \"10b2465a-61c4-4f13-9649-04138927dd46\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.433606 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-utilities" (OuterVolumeSpecName: "utilities") pod "10b2465a-61c4-4f13-9649-04138927dd46" (UID: "10b2465a-61c4-4f13-9649-04138927dd46"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.437130 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10b2465a-61c4-4f13-9649-04138927dd46-kube-api-access-cmvcl" (OuterVolumeSpecName: "kube-api-access-cmvcl") pod "10b2465a-61c4-4f13-9649-04138927dd46" (UID: "10b2465a-61c4-4f13-9649-04138927dd46"). InnerVolumeSpecName "kube-api-access-cmvcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.483491 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10b2465a-61c4-4f13-9649-04138927dd46" (UID: "10b2465a-61c4-4f13-9649-04138927dd46"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.534545 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.534586 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmvcl\" (UniqueName: \"kubernetes.io/projected/10b2465a-61c4-4f13-9649-04138927dd46-kube-api-access-cmvcl\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.534598 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.914977 4994 generic.go:334] "Generic (PLEG): container finished" podID="10b2465a-61c4-4f13-9649-04138927dd46" containerID="b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141" exitCode=0 Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.915065 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.915072 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlw24" event={"ID":"10b2465a-61c4-4f13-9649-04138927dd46","Type":"ContainerDied","Data":"b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141"} Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.915344 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlw24" event={"ID":"10b2465a-61c4-4f13-9649-04138927dd46","Type":"ContainerDied","Data":"a6ebce8a9cc9b04bd5530ce4f4161d48c32326fbdbfa53b02c09cc0f94ae4154"} Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.915363 4994 scope.go:117] "RemoveContainer" containerID="b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.932417 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dlw24"] Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.934912 4994 scope.go:117] "RemoveContainer" containerID="c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.938070 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dlw24"] Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.952789 4994 scope.go:117] "RemoveContainer" containerID="ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.977400 4994 scope.go:117] "RemoveContainer" containerID="b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141" Mar 10 00:27:46 crc kubenswrapper[4994]: E0310 00:27:46.977839 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141\": container with ID starting with b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141 not found: ID does not exist" containerID="b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.977893 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141"} err="failed to get container status \"b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141\": rpc error: code = NotFound desc = could not find container \"b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141\": container with ID starting with b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141 not found: ID does not exist" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.977920 4994 scope.go:117] "RemoveContainer" containerID="c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff" Mar 10 00:27:46 crc kubenswrapper[4994]: E0310 00:27:46.978341 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff\": container with ID starting with c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff not found: ID does not exist" containerID="c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.978369 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff"} err="failed to get container status \"c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff\": rpc error: code = NotFound desc = could not find container \"c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff\": container with ID 
starting with c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff not found: ID does not exist" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.978394 4994 scope.go:117] "RemoveContainer" containerID="ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4" Mar 10 00:27:46 crc kubenswrapper[4994]: E0310 00:27:46.978791 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4\": container with ID starting with ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4 not found: ID does not exist" containerID="ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.978815 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4"} err="failed to get container status \"ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4\": rpc error: code = NotFound desc = could not find container \"ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4\": container with ID starting with ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4 not found: ID does not exist" Mar 10 00:27:48 crc kubenswrapper[4994]: I0310 00:27:48.565338 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10b2465a-61c4-4f13-9649-04138927dd46" path="/var/lib/kubelet/pods/10b2465a-61c4-4f13-9649-04138927dd46/volumes" Mar 10 00:27:48 crc kubenswrapper[4994]: I0310 00:27:48.892667 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:27:48 crc kubenswrapper[4994]: I0310 
00:27:48.892758 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:27:57 crc kubenswrapper[4994]: I0310 00:27:57.008590 4994 generic.go:334] "Generic (PLEG): container finished" podID="45ce1682-4d9a-4e51-ae93-a1832751d811" containerID="6966590713cea1dd54724874068e8b954718fd9eea114677b3e7a90c5394c8c0" exitCode=0 Mar 10 00:27:57 crc kubenswrapper[4994]: I0310 00:27:57.008718 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"45ce1682-4d9a-4e51-ae93-a1832751d811","Type":"ContainerDied","Data":"6966590713cea1dd54724874068e8b954718fd9eea114677b3e7a90c5394c8c0"} Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.294831 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.493200 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-node-pullsecrets\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.493245 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-buildcachedir\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.493308 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-pull\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.493365 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.493381 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.493434 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-root\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.493474 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mgnf\" (UniqueName: \"kubernetes.io/projected/45ce1682-4d9a-4e51-ae93-a1832751d811-kube-api-access-7mgnf\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.493503 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-build-blob-cache\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.493521 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-proxy-ca-bundles\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.494626 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-run\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.494699 4994 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-system-configs\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.494767 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-push\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.494797 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-ca-bundles\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.494822 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-buildworkdir\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.494239 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.495666 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.496533 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.501457 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45ce1682-4d9a-4e51-ae93-a1832751d811-kube-api-access-7mgnf" (OuterVolumeSpecName: "kube-api-access-7mgnf") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "kube-api-access-7mgnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.502095 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.503900 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.506276 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.506313 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.506328 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.506349 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.506362 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.514784 4994 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.520412 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.607154 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.607191 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.607203 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.607216 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 
00:27:58.607228 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mgnf\" (UniqueName: \"kubernetes.io/projected/45ce1682-4d9a-4e51-ae93-a1832751d811-kube-api-access-7mgnf\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:59 crc kubenswrapper[4994]: I0310 00:27:59.029509 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"45ce1682-4d9a-4e51-ae93-a1832751d811","Type":"ContainerDied","Data":"4cd2b54ef27177ce7bd06c1a0c62cb99ddedbaaaa4b7415ca746e6d6a53589fe"} Mar 10 00:27:59 crc kubenswrapper[4994]: I0310 00:27:59.029588 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cd2b54ef27177ce7bd06c1a0c62cb99ddedbaaaa4b7415ca746e6d6a53589fe" Mar 10 00:27:59 crc kubenswrapper[4994]: I0310 00:27:59.029657 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.163464 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551708-mcgcl"] Mar 10 00:28:00 crc kubenswrapper[4994]: E0310 00:28:00.164227 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45ce1682-4d9a-4e51-ae93-a1832751d811" containerName="docker-build" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.164249 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ce1682-4d9a-4e51-ae93-a1832751d811" containerName="docker-build" Mar 10 00:28:00 crc kubenswrapper[4994]: E0310 00:28:00.164266 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b2465a-61c4-4f13-9649-04138927dd46" containerName="extract-content" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.164277 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b2465a-61c4-4f13-9649-04138927dd46" containerName="extract-content" Mar 10 00:28:00 crc kubenswrapper[4994]: E0310 00:28:00.164291 4994 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="45ce1682-4d9a-4e51-ae93-a1832751d811" containerName="git-clone" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.164301 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ce1682-4d9a-4e51-ae93-a1832751d811" containerName="git-clone" Mar 10 00:28:00 crc kubenswrapper[4994]: E0310 00:28:00.164319 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45ce1682-4d9a-4e51-ae93-a1832751d811" containerName="manage-dockerfile" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.164331 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ce1682-4d9a-4e51-ae93-a1832751d811" containerName="manage-dockerfile" Mar 10 00:28:00 crc kubenswrapper[4994]: E0310 00:28:00.164346 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b2465a-61c4-4f13-9649-04138927dd46" containerName="registry-server" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.164356 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b2465a-61c4-4f13-9649-04138927dd46" containerName="registry-server" Mar 10 00:28:00 crc kubenswrapper[4994]: E0310 00:28:00.164376 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b2465a-61c4-4f13-9649-04138927dd46" containerName="extract-utilities" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.164386 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b2465a-61c4-4f13-9649-04138927dd46" containerName="extract-utilities" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.164568 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="10b2465a-61c4-4f13-9649-04138927dd46" containerName="registry-server" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.164594 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="45ce1682-4d9a-4e51-ae93-a1832751d811" containerName="docker-build" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.165263 4994 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551708-mcgcl" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.168175 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.168345 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.168690 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.173851 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551708-mcgcl"] Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.247325 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.257812 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n85sf\" (UniqueName: \"kubernetes.io/projected/79b6ae72-9c1a-4191-84af-d06b0155e244-kube-api-access-n85sf\") pod \"auto-csr-approver-29551708-mcgcl\" (UID: \"79b6ae72-9c1a-4191-84af-d06b0155e244\") " pod="openshift-infra/auto-csr-approver-29551708-mcgcl" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.258015 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.359518 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n85sf\" (UniqueName: \"kubernetes.io/projected/79b6ae72-9c1a-4191-84af-d06b0155e244-kube-api-access-n85sf\") pod \"auto-csr-approver-29551708-mcgcl\" (UID: \"79b6ae72-9c1a-4191-84af-d06b0155e244\") " pod="openshift-infra/auto-csr-approver-29551708-mcgcl" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.382822 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n85sf\" (UniqueName: \"kubernetes.io/projected/79b6ae72-9c1a-4191-84af-d06b0155e244-kube-api-access-n85sf\") pod \"auto-csr-approver-29551708-mcgcl\" (UID: \"79b6ae72-9c1a-4191-84af-d06b0155e244\") " pod="openshift-infra/auto-csr-approver-29551708-mcgcl" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.494173 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551708-mcgcl" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.773863 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551708-mcgcl"] Mar 10 00:28:00 crc kubenswrapper[4994]: W0310 00:28:00.777428 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79b6ae72_9c1a_4191_84af_d06b0155e244.slice/crio-77987b2d3a6178f38d6d4b99656280ec7ef9e91c4c82ee1cea49141ef3c6115a WatchSource:0}: Error finding container 77987b2d3a6178f38d6d4b99656280ec7ef9e91c4c82ee1cea49141ef3c6115a: Status 404 returned error can't find the container with id 77987b2d3a6178f38d6d4b99656280ec7ef9e91c4c82ee1cea49141ef3c6115a Mar 10 00:28:01 crc kubenswrapper[4994]: I0310 00:28:01.045275 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551708-mcgcl" event={"ID":"79b6ae72-9c1a-4191-84af-d06b0155e244","Type":"ContainerStarted","Data":"77987b2d3a6178f38d6d4b99656280ec7ef9e91c4c82ee1cea49141ef3c6115a"} Mar 10 00:28:01 crc kubenswrapper[4994]: I0310 00:28:01.883150 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:01 crc kubenswrapper[4994]: I0310 00:28:01.896968 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.529214 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.531955 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.542137 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-sys-config" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.543371 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-ca" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.545834 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-global-ca" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.559124 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.577331 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.604561 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j86pn\" (UniqueName: \"kubernetes.io/projected/febc2c12-11b4-423b-98f7-043a38b945e3-kube-api-access-j86pn\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 
00:28:02.604614 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-push\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.604638 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.604660 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.604679 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.604801 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc 
kubenswrapper[4994]: I0310 00:28:02.604944 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.604987 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.605027 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.605078 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.605111 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" 
Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.605138 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-pull\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.706970 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707026 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707057 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707078 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-pull\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc 
kubenswrapper[4994]: I0310 00:28:02.707132 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j86pn\" (UniqueName: \"kubernetes.io/projected/febc2c12-11b4-423b-98f7-043a38b945e3-kube-api-access-j86pn\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707155 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-push\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707180 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707200 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707220 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707250 4994 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707289 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707340 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707637 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707902 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.708215 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.708434 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.708713 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.708769 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.709283 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.709325 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: 
\"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.709932 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.713631 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-push\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.713664 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-pull\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.738436 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j86pn\" (UniqueName: \"kubernetes.io/projected/febc2c12-11b4-423b-98f7-043a38b945e3-kube-api-access-j86pn\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.855731 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:03 crc kubenswrapper[4994]: I0310 00:28:03.072793 4994 generic.go:334] "Generic (PLEG): container finished" podID="79b6ae72-9c1a-4191-84af-d06b0155e244" containerID="76e4652e3cbbbc7dd950c967558f6551927fe404fc62cca88c145579e5829da9" exitCode=0 Mar 10 00:28:03 crc kubenswrapper[4994]: I0310 00:28:03.072903 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551708-mcgcl" event={"ID":"79b6ae72-9c1a-4191-84af-d06b0155e244","Type":"ContainerDied","Data":"76e4652e3cbbbc7dd950c967558f6551927fe404fc62cca88c145579e5829da9"} Mar 10 00:28:03 crc kubenswrapper[4994]: I0310 00:28:03.144191 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 10 00:28:03 crc kubenswrapper[4994]: W0310 00:28:03.152047 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfebc2c12_11b4_423b_98f7_043a38b945e3.slice/crio-bf0cd1c02ee41e3997d72ac822c22536731285fae7a54bbc141b19e514666ea9 WatchSource:0}: Error finding container bf0cd1c02ee41e3997d72ac822c22536731285fae7a54bbc141b19e514666ea9: Status 404 returned error can't find the container with id bf0cd1c02ee41e3997d72ac822c22536731285fae7a54bbc141b19e514666ea9 Mar 10 00:28:04 crc kubenswrapper[4994]: I0310 00:28:04.086411 4994 generic.go:334] "Generic (PLEG): container finished" podID="febc2c12-11b4-423b-98f7-043a38b945e3" containerID="a27ea216bf9458a64549c80f1db46605b4b76efa82db45780a652582c7511391" exitCode=0 Mar 10 00:28:04 crc kubenswrapper[4994]: I0310 00:28:04.086523 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"febc2c12-11b4-423b-98f7-043a38b945e3","Type":"ContainerDied","Data":"a27ea216bf9458a64549c80f1db46605b4b76efa82db45780a652582c7511391"} Mar 10 00:28:04 crc kubenswrapper[4994]: I0310 00:28:04.086596 4994 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"febc2c12-11b4-423b-98f7-043a38b945e3","Type":"ContainerStarted","Data":"bf0cd1c02ee41e3997d72ac822c22536731285fae7a54bbc141b19e514666ea9"} Mar 10 00:28:04 crc kubenswrapper[4994]: I0310 00:28:04.437223 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551708-mcgcl" Mar 10 00:28:04 crc kubenswrapper[4994]: I0310 00:28:04.535635 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n85sf\" (UniqueName: \"kubernetes.io/projected/79b6ae72-9c1a-4191-84af-d06b0155e244-kube-api-access-n85sf\") pod \"79b6ae72-9c1a-4191-84af-d06b0155e244\" (UID: \"79b6ae72-9c1a-4191-84af-d06b0155e244\") " Mar 10 00:28:04 crc kubenswrapper[4994]: I0310 00:28:04.542310 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79b6ae72-9c1a-4191-84af-d06b0155e244-kube-api-access-n85sf" (OuterVolumeSpecName: "kube-api-access-n85sf") pod "79b6ae72-9c1a-4191-84af-d06b0155e244" (UID: "79b6ae72-9c1a-4191-84af-d06b0155e244"). InnerVolumeSpecName "kube-api-access-n85sf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:28:04 crc kubenswrapper[4994]: I0310 00:28:04.637124 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n85sf\" (UniqueName: \"kubernetes.io/projected/79b6ae72-9c1a-4191-84af-d06b0155e244-kube-api-access-n85sf\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:05 crc kubenswrapper[4994]: I0310 00:28:05.096309 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551708-mcgcl" Mar 10 00:28:05 crc kubenswrapper[4994]: I0310 00:28:05.096272 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551708-mcgcl" event={"ID":"79b6ae72-9c1a-4191-84af-d06b0155e244","Type":"ContainerDied","Data":"77987b2d3a6178f38d6d4b99656280ec7ef9e91c4c82ee1cea49141ef3c6115a"} Mar 10 00:28:05 crc kubenswrapper[4994]: I0310 00:28:05.097793 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77987b2d3a6178f38d6d4b99656280ec7ef9e91c4c82ee1cea49141ef3c6115a" Mar 10 00:28:05 crc kubenswrapper[4994]: I0310 00:28:05.099156 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"febc2c12-11b4-423b-98f7-043a38b945e3","Type":"ContainerStarted","Data":"925f54c347107dfef1d6a968040c02d90a520675d87da19db00e9ab9a077bd4f"} Mar 10 00:28:05 crc kubenswrapper[4994]: I0310 00:28:05.136287 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-1-build" podStartSLOduration=3.136265613 podStartE2EDuration="3.136265613s" podCreationTimestamp="2026-03-10 00:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:28:05.134232283 +0000 UTC m=+1299.307939052" watchObservedRunningTime="2026-03-10 00:28:05.136265613 +0000 UTC m=+1299.309972372" Mar 10 00:28:05 crc kubenswrapper[4994]: I0310 00:28:05.510119 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551702-qz8hb"] Mar 10 00:28:05 crc kubenswrapper[4994]: I0310 00:28:05.523319 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551702-qz8hb"] Mar 10 00:28:06 crc kubenswrapper[4994]: I0310 00:28:06.564614 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0dbd271f-5f29-4221-bfbe-2274ce440c29" path="/var/lib/kubelet/pods/0dbd271f-5f29-4221-bfbe-2274ce440c29/volumes" Mar 10 00:28:12 crc kubenswrapper[4994]: I0310 00:28:12.853510 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 10 00:28:12 crc kubenswrapper[4994]: I0310 00:28:12.854270 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-bridge-1-build" podUID="febc2c12-11b4-423b-98f7-043a38b945e3" containerName="docker-build" containerID="cri-o://925f54c347107dfef1d6a968040c02d90a520675d87da19db00e9ab9a077bd4f" gracePeriod=30 Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.157542 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_febc2c12-11b4-423b-98f7-043a38b945e3/docker-build/0.log" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.158273 4994 generic.go:334] "Generic (PLEG): container finished" podID="febc2c12-11b4-423b-98f7-043a38b945e3" containerID="925f54c347107dfef1d6a968040c02d90a520675d87da19db00e9ab9a077bd4f" exitCode=1 Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.158318 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"febc2c12-11b4-423b-98f7-043a38b945e3","Type":"ContainerDied","Data":"925f54c347107dfef1d6a968040c02d90a520675d87da19db00e9ab9a077bd4f"} Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.352921 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_febc2c12-11b4-423b-98f7-043a38b945e3/docker-build/0.log" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.353513 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.468782 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-node-pullsecrets\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.468900 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-ca-bundles\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.468954 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-buildworkdir\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.468970 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.468992 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-root\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.469144 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-pull\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.469207 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j86pn\" (UniqueName: \"kubernetes.io/projected/febc2c12-11b4-423b-98f7-043a38b945e3-kube-api-access-j86pn\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.469252 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-buildcachedir\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.469354 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-run\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.469405 4994 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-proxy-ca-bundles\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.469443 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-system-configs\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.469477 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-build-blob-cache\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.469527 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-push\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.470063 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.470103 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). 
InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.471224 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.471297 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.471609 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.472584 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.481079 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/febc2c12-11b4-423b-98f7-043a38b945e3-kube-api-access-j86pn" (OuterVolumeSpecName: "kube-api-access-j86pn") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "kube-api-access-j86pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.481109 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.481170 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.471720 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.571443 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.571489 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.571506 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.571522 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.571539 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.571555 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.571570 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-pull\") on node 
\"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.571586 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j86pn\" (UniqueName: \"kubernetes.io/projected/febc2c12-11b4-423b-98f7-043a38b945e3-kube-api-access-j86pn\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.571603 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.578825 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.673503 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.888925 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.977303 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.166982 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_febc2c12-11b4-423b-98f7-043a38b945e3/docker-build/0.log" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.167615 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"febc2c12-11b4-423b-98f7-043a38b945e3","Type":"ContainerDied","Data":"bf0cd1c02ee41e3997d72ac822c22536731285fae7a54bbc141b19e514666ea9"} Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.167687 4994 scope.go:117] "RemoveContainer" containerID="925f54c347107dfef1d6a968040c02d90a520675d87da19db00e9ab9a077bd4f" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.167739 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.191906 4994 scope.go:117] "RemoveContainer" containerID="a27ea216bf9458a64549c80f1db46605b4b76efa82db45780a652582c7511391" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.215660 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.221435 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.478553 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 10 00:28:14 crc kubenswrapper[4994]: E0310 00:28:14.478762 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="febc2c12-11b4-423b-98f7-043a38b945e3" containerName="manage-dockerfile" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.478773 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="febc2c12-11b4-423b-98f7-043a38b945e3" containerName="manage-dockerfile" Mar 10 00:28:14 crc kubenswrapper[4994]: E0310 00:28:14.478784 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b6ae72-9c1a-4191-84af-d06b0155e244" containerName="oc" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.478790 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b6ae72-9c1a-4191-84af-d06b0155e244" containerName="oc" Mar 10 00:28:14 crc kubenswrapper[4994]: E0310 00:28:14.478798 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="febc2c12-11b4-423b-98f7-043a38b945e3" containerName="docker-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.478804 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="febc2c12-11b4-423b-98f7-043a38b945e3" containerName="docker-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.478911 4994 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="febc2c12-11b4-423b-98f7-043a38b945e3" containerName="docker-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.478927 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="79b6ae72-9c1a-4191-84af-d06b0155e244" containerName="oc" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.479683 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.481480 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.481829 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-ca" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.482291 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-global-ca" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.484397 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-sys-config" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.484930 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.484965 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: 
I0310 00:28:14.485029 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.485086 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.485110 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.485210 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.485243 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-pull\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 
00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.485266 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.485323 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.485364 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlxdw\" (UniqueName: \"kubernetes.io/projected/45372411-b93c-4485-8fea-d6802d98592f-kube-api-access-zlxdw\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.485390 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-push\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.485433 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " 
pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.503713 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.562333 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="febc2c12-11b4-423b-98f7-043a38b945e3" path="/var/lib/kubelet/pods/febc2c12-11b4-423b-98f7-043a38b945e3/volumes" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587006 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587043 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-pull\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587059 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587087 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 
10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587131 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlxdw\" (UniqueName: \"kubernetes.io/projected/45372411-b93c-4485-8fea-d6802d98592f-kube-api-access-zlxdw\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587155 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-push\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587182 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587222 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587247 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587266 4994 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587288 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587310 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587666 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587691 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587762 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" 
(UniqueName: \"kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587965 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.588000 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.588382 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.588695 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.589007 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: 
\"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.590225 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.591598 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-push\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.598685 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-pull\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.604486 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlxdw\" (UniqueName: \"kubernetes.io/projected/45372411-b93c-4485-8fea-d6802d98592f-kube-api-access-zlxdw\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.794397 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:15 crc kubenswrapper[4994]: I0310 00:28:15.066089 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 10 00:28:15 crc kubenswrapper[4994]: I0310 00:28:15.176292 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"45372411-b93c-4485-8fea-d6802d98592f","Type":"ContainerStarted","Data":"ac5cbcff64507a3b220a97a744b7b6014bcc97611b25e2ef088c6ab089cef2c0"} Mar 10 00:28:16 crc kubenswrapper[4994]: I0310 00:28:16.190384 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"45372411-b93c-4485-8fea-d6802d98592f","Type":"ContainerStarted","Data":"848724b6dc463e3e8e91c8b36236fbf49ccdd2d3dd3583be8782337ddb94d9eb"} Mar 10 00:28:17 crc kubenswrapper[4994]: I0310 00:28:17.200074 4994 generic.go:334] "Generic (PLEG): container finished" podID="45372411-b93c-4485-8fea-d6802d98592f" containerID="848724b6dc463e3e8e91c8b36236fbf49ccdd2d3dd3583be8782337ddb94d9eb" exitCode=0 Mar 10 00:28:17 crc kubenswrapper[4994]: I0310 00:28:17.200155 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"45372411-b93c-4485-8fea-d6802d98592f","Type":"ContainerDied","Data":"848724b6dc463e3e8e91c8b36236fbf49ccdd2d3dd3583be8782337ddb94d9eb"} Mar 10 00:28:18 crc kubenswrapper[4994]: I0310 00:28:18.210566 4994 generic.go:334] "Generic (PLEG): container finished" podID="45372411-b93c-4485-8fea-d6802d98592f" containerID="257cd17c481a0d71313837e0519c20670a47ba4ec6bde6881a97a4383e647ea5" exitCode=0 Mar 10 00:28:18 crc kubenswrapper[4994]: I0310 00:28:18.210645 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"45372411-b93c-4485-8fea-d6802d98592f","Type":"ContainerDied","Data":"257cd17c481a0d71313837e0519c20670a47ba4ec6bde6881a97a4383e647ea5"} Mar 10 00:28:18 
crc kubenswrapper[4994]: I0310 00:28:18.277847 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-2-build_45372411-b93c-4485-8fea-d6802d98592f/manage-dockerfile/0.log" Mar 10 00:28:18 crc kubenswrapper[4994]: I0310 00:28:18.893198 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:28:18 crc kubenswrapper[4994]: I0310 00:28:18.893680 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:28:19 crc kubenswrapper[4994]: I0310 00:28:19.223108 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"45372411-b93c-4485-8fea-d6802d98592f","Type":"ContainerStarted","Data":"e48b2d6a05a6181bd819f2462e0e5a97bbf8486f23ce25be1a9818712e37ac69"} Mar 10 00:28:19 crc kubenswrapper[4994]: I0310 00:28:19.270715 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-2-build" podStartSLOduration=5.270684798 podStartE2EDuration="5.270684798s" podCreationTimestamp="2026-03-10 00:28:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:28:19.266813363 +0000 UTC m=+1313.440520162" watchObservedRunningTime="2026-03-10 00:28:19.270684798 +0000 UTC m=+1313.444391577" Mar 10 00:28:21 crc kubenswrapper[4994]: I0310 00:28:21.662254 4994 scope.go:117] "RemoveContainer" 
containerID="579778a9ca15bde63cc28eb094f98e7025e92dc67fe3a2f215a9a028c2966910" Mar 10 00:28:48 crc kubenswrapper[4994]: I0310 00:28:48.892392 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:28:48 crc kubenswrapper[4994]: I0310 00:28:48.893128 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:28:48 crc kubenswrapper[4994]: I0310 00:28:48.893188 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:28:48 crc kubenswrapper[4994]: I0310 00:28:48.894141 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"779b11783e082c837efecec96b026cd3be87293636b7184dfd3efe1ae146c491"} pod="openshift-machine-config-operator/machine-config-daemon-kfljj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 00:28:48 crc kubenswrapper[4994]: I0310 00:28:48.894234 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" containerID="cri-o://779b11783e082c837efecec96b026cd3be87293636b7184dfd3efe1ae146c491" gracePeriod=600 Mar 10 00:28:49 crc kubenswrapper[4994]: I0310 00:28:49.484526 4994 generic.go:334] "Generic (PLEG): container finished" 
podID="ced5d66d-39df-4267-b801-e1e60d517ace" containerID="779b11783e082c837efecec96b026cd3be87293636b7184dfd3efe1ae146c491" exitCode=0 Mar 10 00:28:49 crc kubenswrapper[4994]: I0310 00:28:49.484603 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerDied","Data":"779b11783e082c837efecec96b026cd3be87293636b7184dfd3efe1ae146c491"} Mar 10 00:28:49 crc kubenswrapper[4994]: I0310 00:28:49.485144 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"d8d935625d60ec1fe79acd428aa0c427cb2a184ba8e0a37f25ea8bb9485e5629"} Mar 10 00:28:49 crc kubenswrapper[4994]: I0310 00:28:49.485188 4994 scope.go:117] "RemoveContainer" containerID="4df6fccd58598d7ef69c17858b0dc84c63fc5b8f887c3889c08c1cb8b68f120e" Mar 10 00:29:05 crc kubenswrapper[4994]: I0310 00:29:05.635627 4994 generic.go:334] "Generic (PLEG): container finished" podID="45372411-b93c-4485-8fea-d6802d98592f" containerID="e48b2d6a05a6181bd819f2462e0e5a97bbf8486f23ce25be1a9818712e37ac69" exitCode=0 Mar 10 00:29:05 crc kubenswrapper[4994]: I0310 00:29:05.635702 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"45372411-b93c-4485-8fea-d6802d98592f","Type":"ContainerDied","Data":"e48b2d6a05a6181bd819f2462e0e5a97bbf8486f23ce25be1a9818712e37ac69"} Mar 10 00:29:06 crc kubenswrapper[4994]: I0310 00:29:06.957655 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.127320 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-buildcachedir\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.127449 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-node-pullsecrets\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.127503 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-build-blob-cache\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.127588 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-system-configs\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.127654 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-ca-bundles\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.127732 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-root\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.127784 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-pull\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.127854 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlxdw\" (UniqueName: \"kubernetes.io/projected/45372411-b93c-4485-8fea-d6802d98592f-kube-api-access-zlxdw\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.130140 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-buildworkdir\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.130207 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-proxy-ca-bundles\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.128142 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod 
"45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.130267 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-push\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.130325 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-run\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.128189 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.129570 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.129967 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.130844 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.130865 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.130911 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.130932 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.131284 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). 
InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.131620 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.136165 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45372411-b93c-4485-8fea-d6802d98592f-kube-api-access-zlxdw" (OuterVolumeSpecName: "kube-api-access-zlxdw") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "kube-api-access-zlxdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.136369 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.136429 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.232986 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.233033 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlxdw\" (UniqueName: \"kubernetes.io/projected/45372411-b93c-4485-8fea-d6802d98592f-kube-api-access-zlxdw\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.233052 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.233071 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.233089 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.431270 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.436411 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.655252 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"45372411-b93c-4485-8fea-d6802d98592f","Type":"ContainerDied","Data":"ac5cbcff64507a3b220a97a744b7b6014bcc97611b25e2ef088c6ab089cef2c0"} Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.655328 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac5cbcff64507a3b220a97a744b7b6014bcc97611b25e2ef088c6ab089cef2c0" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.655368 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 10 00:29:09 crc kubenswrapper[4994]: I0310 00:29:09.799038 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:09 crc kubenswrapper[4994]: I0310 00:29:09.873799 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:09 crc kubenswrapper[4994]: I0310 00:29:09.952296 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:09 crc kubenswrapper[4994]: I0310 00:29:09.975234 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.303098 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 10 00:29:11 crc kubenswrapper[4994]: E0310 00:29:11.303674 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45372411-b93c-4485-8fea-d6802d98592f" containerName="docker-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.303689 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="45372411-b93c-4485-8fea-d6802d98592f" containerName="docker-build" Mar 10 00:29:11 crc kubenswrapper[4994]: E0310 00:29:11.303705 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45372411-b93c-4485-8fea-d6802d98592f" containerName="manage-dockerfile" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.303714 4994 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="45372411-b93c-4485-8fea-d6802d98592f" containerName="manage-dockerfile" Mar 10 00:29:11 crc kubenswrapper[4994]: E0310 00:29:11.303726 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45372411-b93c-4485-8fea-d6802d98592f" containerName="git-clone" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.303734 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="45372411-b93c-4485-8fea-d6802d98592f" containerName="git-clone" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.303900 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="45372411-b93c-4485-8fea-d6802d98592f" containerName="docker-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.304600 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.307339 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-sys-config" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.307469 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.307761 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-global-ca" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.308945 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-ca" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.331258 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398361 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398425 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398458 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398480 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398519 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn7kb\" (UniqueName: \"kubernetes.io/projected/5673610d-32a0-4fe8-974d-e919df0dc6aa-kube-api-access-mn7kb\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc 
kubenswrapper[4994]: I0310 00:29:11.398546 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398571 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398646 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398695 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398729 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-pull\") 
pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398847 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398983 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.500508 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.500580 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.500638 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.500664 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.500692 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.500716 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.500800 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.500861 4994 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.500944 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn7kb\" (UniqueName: \"kubernetes.io/projected/5673610d-32a0-4fe8-974d-e919df0dc6aa-kube-api-access-mn7kb\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.501332 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.501365 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.501379 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 
00:29:11.501428 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.501465 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.501507 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.501552 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.501599 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " 
pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.501813 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.501837 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.502143 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.502773 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.509626 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: 
\"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.515028 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.522365 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn7kb\" (UniqueName: \"kubernetes.io/projected/5673610d-32a0-4fe8-974d-e919df0dc6aa-kube-api-access-mn7kb\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.628260 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:12 crc kubenswrapper[4994]: I0310 00:29:12.120276 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 10 00:29:12 crc kubenswrapper[4994]: I0310 00:29:12.695663 4994 generic.go:334] "Generic (PLEG): container finished" podID="5673610d-32a0-4fe8-974d-e919df0dc6aa" containerID="8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362" exitCode=0 Mar 10 00:29:12 crc kubenswrapper[4994]: I0310 00:29:12.695927 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"5673610d-32a0-4fe8-974d-e919df0dc6aa","Type":"ContainerDied","Data":"8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362"} Mar 10 00:29:12 crc kubenswrapper[4994]: I0310 00:29:12.695958 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"5673610d-32a0-4fe8-974d-e919df0dc6aa","Type":"ContainerStarted","Data":"b4cf3f288f48594f960e028ba9e5003a8d1167d0c12c4cd7e1d79c3242ec5dda"} Mar 10 00:29:13 crc kubenswrapper[4994]: I0310 00:29:13.708939 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"5673610d-32a0-4fe8-974d-e919df0dc6aa","Type":"ContainerStarted","Data":"0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be"} Mar 10 00:29:13 crc kubenswrapper[4994]: I0310 00:29:13.744937 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-1-build" podStartSLOduration=2.744905829 podStartE2EDuration="2.744905829s" podCreationTimestamp="2026-03-10 00:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:29:13.743462862 +0000 UTC m=+1367.917169681" 
watchObservedRunningTime="2026-03-10 00:29:13.744905829 +0000 UTC m=+1367.918612618" Mar 10 00:29:21 crc kubenswrapper[4994]: I0310 00:29:21.831417 4994 scope.go:117] "RemoveContainer" containerID="eb388fa374116b603771455669ec292f8665b546692a48ed5cd70354b73fe681" Mar 10 00:29:21 crc kubenswrapper[4994]: I0310 00:29:21.879067 4994 scope.go:117] "RemoveContainer" containerID="5b7a38adc0c6a78554c415c32e9eb6f227d97fceb20d2f07317cb0b1675be632" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.036252 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.036734 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/prometheus-webhook-snmp-1-build" podUID="5673610d-32a0-4fe8-974d-e919df0dc6aa" containerName="docker-build" containerID="cri-o://0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be" gracePeriod=30 Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.445379 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_5673610d-32a0-4fe8-974d-e919df0dc6aa/docker-build/0.log" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.446172 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584395 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildcachedir\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584470 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-proxy-ca-bundles\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584541 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-system-configs\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584542 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584579 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-run\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584641 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildworkdir\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584691 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-root\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584753 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn7kb\" (UniqueName: \"kubernetes.io/projected/5673610d-32a0-4fe8-974d-e919df0dc6aa-kube-api-access-mn7kb\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584797 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-push\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584863 4994 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-blob-cache\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584942 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-ca-bundles\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584987 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-node-pullsecrets\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.585024 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-pull\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.585392 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.585776 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). 
InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.585792 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.586007 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.586460 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.586640 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.588168 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.594057 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.594276 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.594591 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5673610d-32a0-4fe8-974d-e919df0dc6aa-kube-api-access-mn7kb" (OuterVolumeSpecName: "kube-api-access-mn7kb") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "kube-api-access-mn7kb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.662242 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.687540 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.687586 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.687606 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.687623 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.687638 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.687653 4994 reconciler_common.go:293] 
"Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.687669 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.687684 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn7kb\" (UniqueName: \"kubernetes.io/projected/5673610d-32a0-4fe8-974d-e919df0dc6aa-kube-api-access-mn7kb\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.687698 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.687712 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.784093 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_5673610d-32a0-4fe8-974d-e919df0dc6aa/docker-build/0.log" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.784530 4994 generic.go:334] "Generic (PLEG): container finished" podID="5673610d-32a0-4fe8-974d-e919df0dc6aa" containerID="0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be" exitCode=1 Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.784587 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" 
event={"ID":"5673610d-32a0-4fe8-974d-e919df0dc6aa","Type":"ContainerDied","Data":"0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be"} Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.784614 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"5673610d-32a0-4fe8-974d-e919df0dc6aa","Type":"ContainerDied","Data":"b4cf3f288f48594f960e028ba9e5003a8d1167d0c12c4cd7e1d79c3242ec5dda"} Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.784635 4994 scope.go:117] "RemoveContainer" containerID="0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.784796 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.807314 4994 scope.go:117] "RemoveContainer" containerID="8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.835969 4994 scope.go:117] "RemoveContainer" containerID="0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be" Mar 10 00:29:22 crc kubenswrapper[4994]: E0310 00:29:22.837004 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be\": container with ID starting with 0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be not found: ID does not exist" containerID="0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.837051 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be"} err="failed to get container status 
\"0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be\": rpc error: code = NotFound desc = could not find container \"0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be\": container with ID starting with 0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be not found: ID does not exist" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.837088 4994 scope.go:117] "RemoveContainer" containerID="8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362" Mar 10 00:29:22 crc kubenswrapper[4994]: E0310 00:29:22.837510 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362\": container with ID starting with 8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362 not found: ID does not exist" containerID="8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.837709 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362"} err="failed to get container status \"8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362\": rpc error: code = NotFound desc = could not find container \"8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362\": container with ID starting with 8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362 not found: ID does not exist" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.012535 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.095030 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.126381 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.134788 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.771618 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 10 00:29:23 crc kubenswrapper[4994]: E0310 00:29:23.772436 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5673610d-32a0-4fe8-974d-e919df0dc6aa" containerName="docker-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.772620 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="5673610d-32a0-4fe8-974d-e919df0dc6aa" containerName="docker-build" Mar 10 00:29:23 crc kubenswrapper[4994]: E0310 00:29:23.772805 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5673610d-32a0-4fe8-974d-e919df0dc6aa" containerName="manage-dockerfile" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.773025 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="5673610d-32a0-4fe8-974d-e919df0dc6aa" containerName="manage-dockerfile" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.773407 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="5673610d-32a0-4fe8-974d-e919df0dc6aa" containerName="docker-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.775249 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.779516 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-ca" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.779787 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-global-ca" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.779862 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-sys-config" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.780497 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.810051 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.907335 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.907399 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.907426 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.907645 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.907752 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.907781 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g8bl\" (UniqueName: \"kubernetes.io/projected/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-kube-api-access-7g8bl\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.907803 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: 
\"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.907826 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.907901 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.908020 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.908071 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.908158 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009215 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009260 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009287 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009304 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 
00:29:24.009336 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009366 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009390 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g8bl\" (UniqueName: \"kubernetes.io/projected/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-kube-api-access-7g8bl\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009415 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009432 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009461 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009504 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009537 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009775 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.010064 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.010106 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.010417 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.010576 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.010679 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.010686 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.011033 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.011218 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.014307 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.015072 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.028640 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g8bl\" (UniqueName: \"kubernetes.io/projected/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-kube-api-access-7g8bl\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.100449 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.337751 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"]
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.564635 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5673610d-32a0-4fe8-974d-e919df0dc6aa" path="/var/lib/kubelet/pods/5673610d-32a0-4fe8-974d-e919df0dc6aa/volumes"
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.804676 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5","Type":"ContainerStarted","Data":"fbe6d9cc5efceee105fad5a2e933a33bc5f5468a2f7d89006b830968fada061c"}
Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.804721 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5","Type":"ContainerStarted","Data":"e73b4ce11c2c82beb195c98c6499aa1914f9f2f2e89ddf242434359b3b2d56d3"}
Mar 10 00:29:25 crc kubenswrapper[4994]: I0310 00:29:25.813293 4994 generic.go:334] "Generic (PLEG): container finished" podID="74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" containerID="fbe6d9cc5efceee105fad5a2e933a33bc5f5468a2f7d89006b830968fada061c" exitCode=0
Mar 10 00:29:25 crc kubenswrapper[4994]: I0310 00:29:25.813353 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5","Type":"ContainerDied","Data":"fbe6d9cc5efceee105fad5a2e933a33bc5f5468a2f7d89006b830968fada061c"}
Mar 10 00:29:26 crc kubenswrapper[4994]: I0310 00:29:26.823529 4994 generic.go:334] "Generic (PLEG): container finished" podID="74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" containerID="8fb9b76320be4c5299fb59a812c11fb1baa5de358024e62cc8edfe3261bbeaae" exitCode=0
Mar 10 00:29:26 crc kubenswrapper[4994]: I0310 00:29:26.823622 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5","Type":"ContainerDied","Data":"8fb9b76320be4c5299fb59a812c11fb1baa5de358024e62cc8edfe3261bbeaae"}
Mar 10 00:29:26 crc kubenswrapper[4994]: I0310 00:29:26.881939 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-2-build_74b83c90-7b71-4ae1-99a1-9f4b3e559ee5/manage-dockerfile/0.log"
Mar 10 00:29:27 crc kubenswrapper[4994]: I0310 00:29:27.835181 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5","Type":"ContainerStarted","Data":"1cec55edb935643ef06816de90b5acce8ec6dca68cf8395b56b9eae6f63e0b45"}
Mar 10 00:29:27 crc kubenswrapper[4994]: I0310 00:29:27.877336 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-2-build" podStartSLOduration=4.877306541 podStartE2EDuration="4.877306541s" podCreationTimestamp="2026-03-10 00:29:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:29:27.871293807 +0000 UTC m=+1382.045000586" watchObservedRunningTime="2026-03-10 00:29:27.877306541 +0000 UTC m=+1382.051013330"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.158847 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551710-vrwfw"]
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.160817 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551710-vrwfw"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.164370 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.164546 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.164746 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.171638 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh"]
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.172677 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.177413 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.177695 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.215819 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh"]
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.226646 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551710-vrwfw"]
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.259779 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzct8\" (UniqueName: \"kubernetes.io/projected/907fae93-d4a7-46e8-9fab-3c964fcb52ab-kube-api-access-wzct8\") pod \"auto-csr-approver-29551710-vrwfw\" (UID: \"907fae93-d4a7-46e8-9fab-3c964fcb52ab\") " pod="openshift-infra/auto-csr-approver-29551710-vrwfw"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.259849 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/938a13c5-9bbf-4720-95ae-9e56e9d1f085-config-volume\") pod \"collect-profiles-29551710-xlhkh\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.260021 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzpmj\" (UniqueName: \"kubernetes.io/projected/938a13c5-9bbf-4720-95ae-9e56e9d1f085-kube-api-access-tzpmj\") pod \"collect-profiles-29551710-xlhkh\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.260070 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/938a13c5-9bbf-4720-95ae-9e56e9d1f085-secret-volume\") pod \"collect-profiles-29551710-xlhkh\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.361074 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzpmj\" (UniqueName: \"kubernetes.io/projected/938a13c5-9bbf-4720-95ae-9e56e9d1f085-kube-api-access-tzpmj\") pod \"collect-profiles-29551710-xlhkh\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.361144 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/938a13c5-9bbf-4720-95ae-9e56e9d1f085-secret-volume\") pod \"collect-profiles-29551710-xlhkh\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.361216 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzct8\" (UniqueName: \"kubernetes.io/projected/907fae93-d4a7-46e8-9fab-3c964fcb52ab-kube-api-access-wzct8\") pod \"auto-csr-approver-29551710-vrwfw\" (UID: \"907fae93-d4a7-46e8-9fab-3c964fcb52ab\") " pod="openshift-infra/auto-csr-approver-29551710-vrwfw"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.361243 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/938a13c5-9bbf-4720-95ae-9e56e9d1f085-config-volume\") pod \"collect-profiles-29551710-xlhkh\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.362703 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/938a13c5-9bbf-4720-95ae-9e56e9d1f085-config-volume\") pod \"collect-profiles-29551710-xlhkh\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.380986 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/938a13c5-9bbf-4720-95ae-9e56e9d1f085-secret-volume\") pod \"collect-profiles-29551710-xlhkh\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.383924 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzct8\" (UniqueName: \"kubernetes.io/projected/907fae93-d4a7-46e8-9fab-3c964fcb52ab-kube-api-access-wzct8\") pod \"auto-csr-approver-29551710-vrwfw\" (UID: \"907fae93-d4a7-46e8-9fab-3c964fcb52ab\") " pod="openshift-infra/auto-csr-approver-29551710-vrwfw"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.389061 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzpmj\" (UniqueName: \"kubernetes.io/projected/938a13c5-9bbf-4720-95ae-9e56e9d1f085-kube-api-access-tzpmj\") pod \"collect-profiles-29551710-xlhkh\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.500516 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551710-vrwfw"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.508954 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh"
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.795851 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551710-vrwfw"]
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.869849 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh"]
Mar 10 00:30:00 crc kubenswrapper[4994]: W0310 00:30:00.874544 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod938a13c5_9bbf_4720_95ae_9e56e9d1f085.slice/crio-b53385df10843aab2ad8ee46166e5f7b08cd9244ce51883694fff2c4568ceffe WatchSource:0}: Error finding container b53385df10843aab2ad8ee46166e5f7b08cd9244ce51883694fff2c4568ceffe: Status 404 returned error can't find the container with id b53385df10843aab2ad8ee46166e5f7b08cd9244ce51883694fff2c4568ceffe
Mar 10 00:30:01 crc kubenswrapper[4994]: I0310 00:30:01.127177 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551710-vrwfw" event={"ID":"907fae93-d4a7-46e8-9fab-3c964fcb52ab","Type":"ContainerStarted","Data":"4377cc881d06904302f5a9c8517a3998ed1077d10d3b6579c8372dc366e78cfb"}
Mar 10 00:30:01 crc kubenswrapper[4994]: I0310 00:30:01.129244 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" event={"ID":"938a13c5-9bbf-4720-95ae-9e56e9d1f085","Type":"ContainerStarted","Data":"40ada2de530609925c6412ebabe6fcca0a00c2d47cc81793d57dac2812d96507"}
Mar 10 00:30:01 crc kubenswrapper[4994]: I0310 00:30:01.129290 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" event={"ID":"938a13c5-9bbf-4720-95ae-9e56e9d1f085","Type":"ContainerStarted","Data":"b53385df10843aab2ad8ee46166e5f7b08cd9244ce51883694fff2c4568ceffe"}
Mar 10 00:30:01 crc kubenswrapper[4994]: I0310 00:30:01.148836 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" podStartSLOduration=1.148807753 podStartE2EDuration="1.148807753s" podCreationTimestamp="2026-03-10 00:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:30:01.145166309 +0000 UTC m=+1415.318873118" watchObservedRunningTime="2026-03-10 00:30:01.148807753 +0000 UTC m=+1415.322514542"
Mar 10 00:30:02 crc kubenswrapper[4994]: I0310 00:30:02.138623 4994 generic.go:334] "Generic (PLEG): container finished" podID="938a13c5-9bbf-4720-95ae-9e56e9d1f085" containerID="40ada2de530609925c6412ebabe6fcca0a00c2d47cc81793d57dac2812d96507" exitCode=0
Mar 10 00:30:02 crc kubenswrapper[4994]: I0310 00:30:02.138705 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" event={"ID":"938a13c5-9bbf-4720-95ae-9e56e9d1f085","Type":"ContainerDied","Data":"40ada2de530609925c6412ebabe6fcca0a00c2d47cc81793d57dac2812d96507"}
Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.149617 4994 generic.go:334] "Generic (PLEG): container finished" podID="907fae93-d4a7-46e8-9fab-3c964fcb52ab" containerID="dc93dce81f7d66a840274eb6f49e057db6ba425ffe2f1cac85085352655d2af7" exitCode=0
Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.149726 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551710-vrwfw" event={"ID":"907fae93-d4a7-46e8-9fab-3c964fcb52ab","Type":"ContainerDied","Data":"dc93dce81f7d66a840274eb6f49e057db6ba425ffe2f1cac85085352655d2af7"}
Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.504571 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh"
Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.615108 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/938a13c5-9bbf-4720-95ae-9e56e9d1f085-secret-volume\") pod \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") "
Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.615545 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/938a13c5-9bbf-4720-95ae-9e56e9d1f085-config-volume\") pod \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") "
Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.615652 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzpmj\" (UniqueName: \"kubernetes.io/projected/938a13c5-9bbf-4720-95ae-9e56e9d1f085-kube-api-access-tzpmj\") pod \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") "
Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.616928 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/938a13c5-9bbf-4720-95ae-9e56e9d1f085-config-volume" (OuterVolumeSpecName: "config-volume") pod "938a13c5-9bbf-4720-95ae-9e56e9d1f085" (UID: "938a13c5-9bbf-4720-95ae-9e56e9d1f085"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.636324 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/938a13c5-9bbf-4720-95ae-9e56e9d1f085-kube-api-access-tzpmj" (OuterVolumeSpecName: "kube-api-access-tzpmj") pod "938a13c5-9bbf-4720-95ae-9e56e9d1f085" (UID: "938a13c5-9bbf-4720-95ae-9e56e9d1f085"). InnerVolumeSpecName "kube-api-access-tzpmj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.636386 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/938a13c5-9bbf-4720-95ae-9e56e9d1f085-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "938a13c5-9bbf-4720-95ae-9e56e9d1f085" (UID: "938a13c5-9bbf-4720-95ae-9e56e9d1f085"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.717330 4994 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/938a13c5-9bbf-4720-95ae-9e56e9d1f085-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.717386 4994 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/938a13c5-9bbf-4720-95ae-9e56e9d1f085-config-volume\") on node \"crc\" DevicePath \"\""
Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.717405 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzpmj\" (UniqueName: \"kubernetes.io/projected/938a13c5-9bbf-4720-95ae-9e56e9d1f085-kube-api-access-tzpmj\") on node \"crc\" DevicePath \"\""
Mar 10 00:30:04 crc kubenswrapper[4994]: I0310 00:30:04.165816 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh"
Mar 10 00:30:04 crc kubenswrapper[4994]: I0310 00:30:04.165854 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" event={"ID":"938a13c5-9bbf-4720-95ae-9e56e9d1f085","Type":"ContainerDied","Data":"b53385df10843aab2ad8ee46166e5f7b08cd9244ce51883694fff2c4568ceffe"}
Mar 10 00:30:04 crc kubenswrapper[4994]: I0310 00:30:04.165956 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b53385df10843aab2ad8ee46166e5f7b08cd9244ce51883694fff2c4568ceffe"
Mar 10 00:30:04 crc kubenswrapper[4994]: I0310 00:30:04.460448 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551710-vrwfw"
Mar 10 00:30:04 crc kubenswrapper[4994]: I0310 00:30:04.528253 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzct8\" (UniqueName: \"kubernetes.io/projected/907fae93-d4a7-46e8-9fab-3c964fcb52ab-kube-api-access-wzct8\") pod \"907fae93-d4a7-46e8-9fab-3c964fcb52ab\" (UID: \"907fae93-d4a7-46e8-9fab-3c964fcb52ab\") "
Mar 10 00:30:04 crc kubenswrapper[4994]: I0310 00:30:04.532843 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/907fae93-d4a7-46e8-9fab-3c964fcb52ab-kube-api-access-wzct8" (OuterVolumeSpecName: "kube-api-access-wzct8") pod "907fae93-d4a7-46e8-9fab-3c964fcb52ab" (UID: "907fae93-d4a7-46e8-9fab-3c964fcb52ab"). InnerVolumeSpecName "kube-api-access-wzct8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:30:04 crc kubenswrapper[4994]: I0310 00:30:04.629718 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzct8\" (UniqueName: \"kubernetes.io/projected/907fae93-d4a7-46e8-9fab-3c964fcb52ab-kube-api-access-wzct8\") on node \"crc\" DevicePath \"\""
Mar 10 00:30:05 crc kubenswrapper[4994]: I0310 00:30:05.178047 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551710-vrwfw" event={"ID":"907fae93-d4a7-46e8-9fab-3c964fcb52ab","Type":"ContainerDied","Data":"4377cc881d06904302f5a9c8517a3998ed1077d10d3b6579c8372dc366e78cfb"}
Mar 10 00:30:05 crc kubenswrapper[4994]: I0310 00:30:05.179202 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4377cc881d06904302f5a9c8517a3998ed1077d10d3b6579c8372dc366e78cfb"
Mar 10 00:30:05 crc kubenswrapper[4994]: I0310 00:30:05.178125 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551710-vrwfw"
Mar 10 00:30:05 crc kubenswrapper[4994]: I0310 00:30:05.526366 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551704-8n96k"]
Mar 10 00:30:05 crc kubenswrapper[4994]: I0310 00:30:05.531915 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551704-8n96k"]
Mar 10 00:30:06 crc kubenswrapper[4994]: I0310 00:30:06.562749 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca1779b4-8945-4667-b086-b7481edf1099" path="/var/lib/kubelet/pods/ca1779b4-8945-4667-b086-b7481edf1099/volumes"
Mar 10 00:30:21 crc kubenswrapper[4994]: I0310 00:30:21.305392 4994 generic.go:334] "Generic (PLEG): container finished" podID="74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" containerID="1cec55edb935643ef06816de90b5acce8ec6dca68cf8395b56b9eae6f63e0b45" exitCode=0
Mar 10 00:30:21 crc kubenswrapper[4994]: I0310 00:30:21.305505 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5","Type":"ContainerDied","Data":"1cec55edb935643ef06816de90b5acce8ec6dca68cf8395b56b9eae6f63e0b45"}
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.088218 4994 scope.go:117] "RemoveContainer" containerID="05dbb86f2a8b07de1b18fc5d17c0892e037b64a657609562e3eb766699973201"
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.648732 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792156 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-push\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") "
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792260 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-blob-cache\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") "
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792291 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-node-pullsecrets\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") "
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792328 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-ca-bundles\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") "
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792362 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-proxy-ca-bundles\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") "
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792392 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g8bl\" (UniqueName: \"kubernetes.io/projected/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-kube-api-access-7g8bl\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") "
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792431 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-root\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") "
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792460 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildcachedir\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") "
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792443 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792494 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-run\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") "
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792527 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-pull\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") "
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792571 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildworkdir\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") "
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792651 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-system-configs\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") "
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792838 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.793864 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.793926 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.794375 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.794550 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.795121 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.795965 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.798345 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.800659 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-kube-api-access-7g8bl" (OuterVolumeSpecName: "kube-api-access-7g8bl") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "kube-api-access-7g8bl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.800957 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.801122 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.886320 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.894975 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.895024 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.895044 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.895068 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g8bl\" (UniqueName: 
\"kubernetes.io/projected/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-kube-api-access-7g8bl\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.895087 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.895107 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.895127 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.895145 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.895163 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:23 crc kubenswrapper[4994]: I0310 00:30:23.331359 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5","Type":"ContainerDied","Data":"e73b4ce11c2c82beb195c98c6499aa1914f9f2f2e89ddf242434359b3b2d56d3"} Mar 10 00:30:23 crc kubenswrapper[4994]: I0310 00:30:23.331422 4994 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="e73b4ce11c2c82beb195c98c6499aa1914f9f2f2e89ddf242434359b3b2d56d3" Mar 10 00:30:23 crc kubenswrapper[4994]: I0310 00:30:23.331423 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:30:23 crc kubenswrapper[4994]: I0310 00:30:23.759763 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:23 crc kubenswrapper[4994]: I0310 00:30:23.808299 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.983779 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 10 00:30:31 crc kubenswrapper[4994]: E0310 00:30:31.984588 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" containerName="docker-build" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.984604 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" containerName="docker-build" Mar 10 00:30:31 crc kubenswrapper[4994]: E0310 00:30:31.984628 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="907fae93-d4a7-46e8-9fab-3c964fcb52ab" containerName="oc" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.984636 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="907fae93-d4a7-46e8-9fab-3c964fcb52ab" containerName="oc" Mar 10 00:30:31 crc 
kubenswrapper[4994]: E0310 00:30:31.984649 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="938a13c5-9bbf-4720-95ae-9e56e9d1f085" containerName="collect-profiles" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.984656 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="938a13c5-9bbf-4720-95ae-9e56e9d1f085" containerName="collect-profiles" Mar 10 00:30:31 crc kubenswrapper[4994]: E0310 00:30:31.984667 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" containerName="manage-dockerfile" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.984673 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" containerName="manage-dockerfile" Mar 10 00:30:31 crc kubenswrapper[4994]: E0310 00:30:31.984689 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" containerName="git-clone" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.984696 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" containerName="git-clone" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.984831 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="938a13c5-9bbf-4720-95ae-9e56e9d1f085" containerName="collect-profiles" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.984849 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="907fae93-d4a7-46e8-9fab-3c964fcb52ab" containerName="oc" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.984858 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" containerName="docker-build" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.985567 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.988952 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-ca" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.989969 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.990231 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-sys-config" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.990979 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-global-ca" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.011522 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.027236 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.027312 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwcj4\" (UniqueName: \"kubernetes.io/projected/2f68d229-c995-41a3-b73b-171d31d81311-kube-api-access-wwcj4\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 
00:30:32.027354 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.027388 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.027446 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.027506 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.027685 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.027811 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.027939 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.028006 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.028066 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " 
pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.028111 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.129575 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.129655 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.129702 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.129745 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.129789 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.129842 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.129850 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.129932 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc 
kubenswrapper[4994]: I0310 00:30:32.129986 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.130016 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwcj4\" (UniqueName: \"kubernetes.io/projected/2f68d229-c995-41a3-b73b-171d31d81311-kube-api-access-wwcj4\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.130047 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.130080 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.130164 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-push\") pod 
\"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.130279 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.130372 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.130690 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.130733 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.130696 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.130777 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.131257 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.131764 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.136975 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " 
pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.139139 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.149805 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwcj4\" (UniqueName: \"kubernetes.io/projected/2f68d229-c995-41a3-b73b-171d31d81311-kube-api-access-wwcj4\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:33 crc kubenswrapper[4994]: I0310 00:30:33.314583 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:33 crc kubenswrapper[4994]: I0310 00:30:33.795518 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 10 00:30:34 crc kubenswrapper[4994]: I0310 00:30:34.340145 4994 generic.go:334] "Generic (PLEG): container finished" podID="2f68d229-c995-41a3-b73b-171d31d81311" containerID="7225b87c2068896f2fce33d1aefcc4a4a471fea15131e40f56a1db24cbb94f3e" exitCode=0 Mar 10 00:30:34 crc kubenswrapper[4994]: I0310 00:30:34.340209 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"2f68d229-c995-41a3-b73b-171d31d81311","Type":"ContainerDied","Data":"7225b87c2068896f2fce33d1aefcc4a4a471fea15131e40f56a1db24cbb94f3e"} Mar 10 00:30:34 crc kubenswrapper[4994]: I0310 00:30:34.340499 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"2f68d229-c995-41a3-b73b-171d31d81311","Type":"ContainerStarted","Data":"b2c407df3480f0bdf28a09e039081c8f8ddbb1de11afdd6db04cccd8d962c75e"} Mar 10 00:30:35 crc kubenswrapper[4994]: I0310 00:30:35.351681 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_2f68d229-c995-41a3-b73b-171d31d81311/docker-build/0.log" Mar 10 00:30:35 crc kubenswrapper[4994]: I0310 00:30:35.354511 4994 generic.go:334] "Generic (PLEG): container finished" podID="2f68d229-c995-41a3-b73b-171d31d81311" containerID="efddc60aba5bd5a67702715a9ba5bd81e253ee925326f34bfc3e8b98fe80390e" exitCode=1 Mar 10 00:30:35 crc kubenswrapper[4994]: I0310 00:30:35.354580 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" 
event={"ID":"2f68d229-c995-41a3-b73b-171d31d81311","Type":"ContainerDied","Data":"efddc60aba5bd5a67702715a9ba5bd81e253ee925326f34bfc3e8b98fe80390e"} Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.702630 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_2f68d229-c995-41a3-b73b-171d31d81311/docker-build/0.log" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.703290 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.796952 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-system-configs\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.797090 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-root\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.797141 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwcj4\" (UniqueName: \"kubernetes.io/projected/2f68d229-c995-41a3-b73b-171d31d81311-kube-api-access-wwcj4\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.797181 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-node-pullsecrets\") pod 
\"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.797237 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-build-blob-cache\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.797290 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-buildworkdir\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.797361 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.797933 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.798096 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.798332 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-proxy-ca-bundles\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.798432 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.798468 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-run\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.798564 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-push\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.798596 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-ca-bundles\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.798692 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-buildcachedir\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.798720 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-pull\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.798851 4994 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.799085 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.799277 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.799343 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.799365 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.799380 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.799392 4994 reconciler_common.go:293] 
"Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.799430 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.799755 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.801391 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.801670 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.802524 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.803592 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.805023 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f68d229-c995-41a3-b73b-171d31d81311-kube-api-access-wwcj4" (OuterVolumeSpecName: "kube-api-access-wwcj4") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "kube-api-access-wwcj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.900641 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.900674 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwcj4\" (UniqueName: \"kubernetes.io/projected/2f68d229-c995-41a3-b73b-171d31d81311-kube-api-access-wwcj4\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.900682 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.900691 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.900699 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.900707 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:37 crc kubenswrapper[4994]: I0310 00:30:37.376481 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_2f68d229-c995-41a3-b73b-171d31d81311/docker-build/0.log" Mar 10 00:30:37 crc kubenswrapper[4994]: I0310 00:30:37.377157 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"2f68d229-c995-41a3-b73b-171d31d81311","Type":"ContainerDied","Data":"b2c407df3480f0bdf28a09e039081c8f8ddbb1de11afdd6db04cccd8d962c75e"} Mar 10 00:30:37 crc kubenswrapper[4994]: I0310 00:30:37.377208 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2c407df3480f0bdf28a09e039081c8f8ddbb1de11afdd6db04cccd8d962c75e" Mar 10 00:30:37 crc kubenswrapper[4994]: I0310 00:30:37.377315 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:42 crc kubenswrapper[4994]: I0310 00:30:42.483101 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 10 00:30:42 crc kubenswrapper[4994]: I0310 00:30:42.496346 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 10 00:30:42 crc kubenswrapper[4994]: I0310 00:30:42.566866 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f68d229-c995-41a3-b73b-171d31d81311" path="/var/lib/kubelet/pods/2f68d229-c995-41a3-b73b-171d31d81311/volumes" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.100469 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 10 00:30:44 crc kubenswrapper[4994]: E0310 00:30:44.100719 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f68d229-c995-41a3-b73b-171d31d81311" containerName="docker-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.100731 4994 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="2f68d229-c995-41a3-b73b-171d31d81311" containerName="docker-build" Mar 10 00:30:44 crc kubenswrapper[4994]: E0310 00:30:44.100747 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f68d229-c995-41a3-b73b-171d31d81311" containerName="manage-dockerfile" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.100755 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f68d229-c995-41a3-b73b-171d31d81311" containerName="manage-dockerfile" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.100862 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f68d229-c995-41a3-b73b-171d31d81311" containerName="docker-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.104457 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.106769 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.106829 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-sys-config" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.110140 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-ca" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.110152 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-global-ca" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.127146 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.206530 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.206650 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.206803 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.206915 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mfl8\" (UniqueName: \"kubernetes.io/projected/7902abc7-2de3-4fde-ba8d-71694a115914-kube-api-access-6mfl8\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.206954 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.206988 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.207018 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.207166 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.207248 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " 
pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.207283 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.207403 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.207495 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308617 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308693 4994 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308722 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308770 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308801 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308830 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mfl8\" (UniqueName: \"kubernetes.io/projected/7902abc7-2de3-4fde-ba8d-71694a115914-kube-api-access-6mfl8\") pod \"service-telemetry-operator-bundle-2-build\" (UID: 
\"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308854 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308892 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308915 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308959 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308996 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" 
(UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.309036 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.309123 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.310003 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.310073 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc 
kubenswrapper[4994]: I0310 00:30:44.310175 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.310190 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.310613 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.310676 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.311330 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: 
\"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.311957 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.314634 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.316280 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.339573 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mfl8\" (UniqueName: \"kubernetes.io/projected/7902abc7-2de3-4fde-ba8d-71694a115914-kube-api-access-6mfl8\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.424570 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.739854 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 10 00:30:45 crc kubenswrapper[4994]: I0310 00:30:45.467606 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"7902abc7-2de3-4fde-ba8d-71694a115914","Type":"ContainerStarted","Data":"6bdd8c21673cfca7f7ecd67cec43deb1c83c74ac59ed88ea65a343de1e5ef343"} Mar 10 00:30:45 crc kubenswrapper[4994]: I0310 00:30:45.468067 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"7902abc7-2de3-4fde-ba8d-71694a115914","Type":"ContainerStarted","Data":"742b6efdf8a46db211dfd77bb0b22ee1b2782176fb1fd4661b2d9ecbddff8536"} Mar 10 00:30:45 crc kubenswrapper[4994]: E0310 00:30:45.617656 4994 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.164:53200->38.102.83.164:37473: read tcp 38.102.83.164:53200->38.102.83.164:37473: read: connection reset by peer Mar 10 00:30:46 crc kubenswrapper[4994]: I0310 00:30:46.489897 4994 generic.go:334] "Generic (PLEG): container finished" podID="7902abc7-2de3-4fde-ba8d-71694a115914" containerID="6bdd8c21673cfca7f7ecd67cec43deb1c83c74ac59ed88ea65a343de1e5ef343" exitCode=0 Mar 10 00:30:46 crc kubenswrapper[4994]: I0310 00:30:46.489944 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"7902abc7-2de3-4fde-ba8d-71694a115914","Type":"ContainerDied","Data":"6bdd8c21673cfca7f7ecd67cec43deb1c83c74ac59ed88ea65a343de1e5ef343"} Mar 10 00:30:47 crc kubenswrapper[4994]: I0310 00:30:47.501621 4994 generic.go:334] "Generic (PLEG): container finished" podID="7902abc7-2de3-4fde-ba8d-71694a115914" 
containerID="f315dad252472048091cc7248e3f2692d6c35b04b3a27299375edace2561d87c" exitCode=0 Mar 10 00:30:47 crc kubenswrapper[4994]: I0310 00:30:47.501719 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"7902abc7-2de3-4fde-ba8d-71694a115914","Type":"ContainerDied","Data":"f315dad252472048091cc7248e3f2692d6c35b04b3a27299375edace2561d87c"} Mar 10 00:30:47 crc kubenswrapper[4994]: I0310 00:30:47.555754 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-2-build_7902abc7-2de3-4fde-ba8d-71694a115914/manage-dockerfile/0.log" Mar 10 00:30:48 crc kubenswrapper[4994]: I0310 00:30:48.515996 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"7902abc7-2de3-4fde-ba8d-71694a115914","Type":"ContainerStarted","Data":"8b19189a44163a87912f9861e6175641d65fd08147a2b1eecc6b0ca0ba642217"} Mar 10 00:30:48 crc kubenswrapper[4994]: I0310 00:30:48.556618 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-bundle-2-build" podStartSLOduration=4.556585839 podStartE2EDuration="4.556585839s" podCreationTimestamp="2026-03-10 00:30:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:30:48.55429375 +0000 UTC m=+1462.728000499" watchObservedRunningTime="2026-03-10 00:30:48.556585839 +0000 UTC m=+1462.730292628" Mar 10 00:30:53 crc kubenswrapper[4994]: I0310 00:30:53.562487 4994 generic.go:334] "Generic (PLEG): container finished" podID="7902abc7-2de3-4fde-ba8d-71694a115914" containerID="8b19189a44163a87912f9861e6175641d65fd08147a2b1eecc6b0ca0ba642217" exitCode=0 Mar 10 00:30:53 crc kubenswrapper[4994]: I0310 00:30:53.562544 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"7902abc7-2de3-4fde-ba8d-71694a115914","Type":"ContainerDied","Data":"8b19189a44163a87912f9861e6175641d65fd08147a2b1eecc6b0ca0ba642217"} Mar 10 00:30:54 crc kubenswrapper[4994]: I0310 00:30:54.935055 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.077560 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-run\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.077752 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-buildcachedir\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.077834 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mfl8\" (UniqueName: \"kubernetes.io/projected/7902abc7-2de3-4fde-ba8d-71694a115914-kube-api-access-6mfl8\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.077841 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.077981 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-buildworkdir\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.078073 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-ca-bundles\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.078184 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-push\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.078230 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-node-pullsecrets\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.078280 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-build-blob-cache\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.078327 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-system-configs\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.078376 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.078462 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-root\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.079001 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.079080 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.079392 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.079585 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.081977 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-pull\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.082066 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-proxy-ca-bundles\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.082564 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-run\") on node \"crc\" DevicePath \"\"" 
Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.082588 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.082601 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.082616 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.082629 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.082641 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.083262 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.083516 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.084795 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.084925 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7902abc7-2de3-4fde-ba8d-71694a115914-kube-api-access-6mfl8" (OuterVolumeSpecName: "kube-api-access-6mfl8") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "kube-api-access-6mfl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.085262 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.093597 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.184275 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.184315 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.184332 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.184344 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mfl8\" (UniqueName: \"kubernetes.io/projected/7902abc7-2de3-4fde-ba8d-71694a115914-kube-api-access-6mfl8\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.184357 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: 
I0310 00:30:55.184371 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.583328 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"7902abc7-2de3-4fde-ba8d-71694a115914","Type":"ContainerDied","Data":"742b6efdf8a46db211dfd77bb0b22ee1b2782176fb1fd4661b2d9ecbddff8536"} Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.583390 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="742b6efdf8a46db211dfd77bb0b22ee1b2782176fb1fd4661b2d9ecbddff8536" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.583391 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.645103 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 10 00:30:58 crc kubenswrapper[4994]: E0310 00:30:58.646037 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7902abc7-2de3-4fde-ba8d-71694a115914" containerName="git-clone" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.646064 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="7902abc7-2de3-4fde-ba8d-71694a115914" containerName="git-clone" Mar 10 00:30:58 crc kubenswrapper[4994]: E0310 00:30:58.646104 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7902abc7-2de3-4fde-ba8d-71694a115914" containerName="docker-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.646115 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="7902abc7-2de3-4fde-ba8d-71694a115914" containerName="docker-build" Mar 10 00:30:58 crc kubenswrapper[4994]: E0310 
00:30:58.646133 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7902abc7-2de3-4fde-ba8d-71694a115914" containerName="manage-dockerfile" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.646144 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="7902abc7-2de3-4fde-ba8d-71694a115914" containerName="manage-dockerfile" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.646340 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="7902abc7-2de3-4fde-ba8d-71694a115914" containerName="docker-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.647627 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.650456 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-sys-config" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.651246 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-global-ca" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.651446 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.651585 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-ca" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.671151 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.841285 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-buildworkdir\") pod 
\"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.841341 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.841491 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.841563 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.841636 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbxgv\" (UniqueName: \"kubernetes.io/projected/a0f543ac-d85b-4776-9cae-5475e4a43318-kube-api-access-sbxgv\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc 
kubenswrapper[4994]: I0310 00:30:58.841686 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.841734 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.841768 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.841834 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.841918 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.841982 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.842017 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943393 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943448 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc 
kubenswrapper[4994]: I0310 00:30:58.943496 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943530 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943568 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943595 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943635 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbxgv\" (UniqueName: \"kubernetes.io/projected/a0f543ac-d85b-4776-9cae-5475e4a43318-kube-api-access-sbxgv\") pod 
\"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943662 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943688 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943712 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943748 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943775 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943983 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.944052 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.944759 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.944860 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 
00:30:58.944991 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.945091 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.945353 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.945361 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.945930 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " 
pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.954924 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.959285 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.974016 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbxgv\" (UniqueName: \"kubernetes.io/projected/a0f543ac-d85b-4776-9cae-5475e4a43318-kube-api-access-sbxgv\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.976917 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:59 crc kubenswrapper[4994]: I0310 00:30:59.491539 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 10 00:30:59 crc kubenswrapper[4994]: W0310 00:30:59.492525 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0f543ac_d85b_4776_9cae_5475e4a43318.slice/crio-07bc0c4b967ac20d07fc8fce99808770b0f8bd4f892bee865400735fd44e1313 WatchSource:0}: Error finding container 07bc0c4b967ac20d07fc8fce99808770b0f8bd4f892bee865400735fd44e1313: Status 404 returned error can't find the container with id 07bc0c4b967ac20d07fc8fce99808770b0f8bd4f892bee865400735fd44e1313 Mar 10 00:30:59 crc kubenswrapper[4994]: I0310 00:30:59.618210 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"a0f543ac-d85b-4776-9cae-5475e4a43318","Type":"ContainerStarted","Data":"07bc0c4b967ac20d07fc8fce99808770b0f8bd4f892bee865400735fd44e1313"} Mar 10 00:31:00 crc kubenswrapper[4994]: I0310 00:31:00.631063 4994 generic.go:334] "Generic (PLEG): container finished" podID="a0f543ac-d85b-4776-9cae-5475e4a43318" containerID="74761a9eff3f39fcf080fcf43da99f605fa6e3e4193c77e19940c640d07270e2" exitCode=0 Mar 10 00:31:00 crc kubenswrapper[4994]: I0310 00:31:00.631134 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"a0f543ac-d85b-4776-9cae-5475e4a43318","Type":"ContainerDied","Data":"74761a9eff3f39fcf080fcf43da99f605fa6e3e4193c77e19940c640d07270e2"} Mar 10 00:31:01 crc kubenswrapper[4994]: I0310 00:31:01.642789 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_a0f543ac-d85b-4776-9cae-5475e4a43318/docker-build/0.log" Mar 10 00:31:01 crc 
kubenswrapper[4994]: I0310 00:31:01.643924 4994 generic.go:334] "Generic (PLEG): container finished" podID="a0f543ac-d85b-4776-9cae-5475e4a43318" containerID="d932d5abafc702ac0d919613d1196b6f6540e0380b55b01cbf3b8f80be098bd1" exitCode=1 Mar 10 00:31:01 crc kubenswrapper[4994]: I0310 00:31:01.643987 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"a0f543ac-d85b-4776-9cae-5475e4a43318","Type":"ContainerDied","Data":"d932d5abafc702ac0d919613d1196b6f6540e0380b55b01cbf3b8f80be098bd1"} Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.018846 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_a0f543ac-d85b-4776-9cae-5475e4a43318/docker-build/0.log" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.020094 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.110637 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-buildcachedir\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.110794 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-push\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.110826 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-buildcachedir" (OuterVolumeSpecName: "buildcachedir") 
pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.110868 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-ca-bundles\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111082 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-build-blob-cache\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111197 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-proxy-ca-bundles\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111353 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbxgv\" (UniqueName: \"kubernetes.io/projected/a0f543ac-d85b-4776-9cae-5475e4a43318-kube-api-access-sbxgv\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111438 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-pull\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: 
\"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111497 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-buildworkdir\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111526 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-node-pullsecrets\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111561 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-root\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111595 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-run\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111634 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-system-configs\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111850 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111974 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111967 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.112315 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.112831 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.113036 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.114327 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.115649 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.116347 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.118592 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.119229 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f543ac-d85b-4776-9cae-5475e4a43318-kube-api-access-sbxgv" (OuterVolumeSpecName: "kube-api-access-sbxgv") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "kube-api-access-sbxgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.119927 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214497 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214553 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214575 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214595 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214613 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214631 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214649 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-ca-bundles\") on node \"crc\" DevicePath \"\"" 
Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214665 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214681 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214698 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbxgv\" (UniqueName: \"kubernetes.io/projected/a0f543ac-d85b-4776-9cae-5475e4a43318-kube-api-access-sbxgv\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214717 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.662928 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_a0f543ac-d85b-4776-9cae-5475e4a43318/docker-build/0.log" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.663671 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"a0f543ac-d85b-4776-9cae-5475e4a43318","Type":"ContainerDied","Data":"07bc0c4b967ac20d07fc8fce99808770b0f8bd4f892bee865400735fd44e1313"} Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.663725 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07bc0c4b967ac20d07fc8fce99808770b0f8bd4f892bee865400735fd44e1313" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.663744 4994 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:31:09 crc kubenswrapper[4994]: I0310 00:31:09.120591 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 10 00:31:09 crc kubenswrapper[4994]: I0310 00:31:09.132099 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.566780 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0f543ac-d85b-4776-9cae-5475e4a43318" path="/var/lib/kubelet/pods/a0f543ac-d85b-4776-9cae-5475e4a43318/volumes" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.769842 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 10 00:31:10 crc kubenswrapper[4994]: E0310 00:31:10.770331 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f543ac-d85b-4776-9cae-5475e4a43318" containerName="manage-dockerfile" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.770365 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f543ac-d85b-4776-9cae-5475e4a43318" containerName="manage-dockerfile" Mar 10 00:31:10 crc kubenswrapper[4994]: E0310 00:31:10.770404 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f543ac-d85b-4776-9cae-5475e4a43318" containerName="docker-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.770418 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f543ac-d85b-4776-9cae-5475e4a43318" containerName="docker-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.770655 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f543ac-d85b-4776-9cae-5475e4a43318" containerName="docker-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.772748 4994 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.776005 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.776785 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-sys-config" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.777392 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.778844 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-global-ca" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.779149 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-ca" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.948722 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.948822 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.948971 
4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.949082 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.949139 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.949201 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.949274 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildworkdir\") pod 
\"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.949355 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.949428 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.949524 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.949653 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: 
I0310 00:31:10.949707 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-957nf\" (UniqueName: \"kubernetes.io/projected/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-kube-api-access-957nf\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.050580 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.050670 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.050724 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.050815 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: 
\"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.050865 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.050929 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.050976 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.050998 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.051028 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.051114 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.051160 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.051230 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.051280 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc 
kubenswrapper[4994]: I0310 00:31:11.051348 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-957nf\" (UniqueName: \"kubernetes.io/projected/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-kube-api-access-957nf\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.051432 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.052513 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.053073 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.053279 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: 
\"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.053800 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.053934 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.054281 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.061679 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.061711 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: 
\"kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.085546 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-957nf\" (UniqueName: \"kubernetes.io/projected/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-kube-api-access-957nf\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.103162 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.433772 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.779358 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f","Type":"ContainerStarted","Data":"072eb920df43a19ae92b79127c1e5d8130581afedbc74bbcfd7fc4942ec7c1aa"} Mar 10 00:31:12 crc kubenswrapper[4994]: I0310 00:31:12.792731 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f","Type":"ContainerStarted","Data":"0c01f1deef18e844bb567b3557ae2178106db4c54958eedf7439e442df977495"} Mar 10 00:31:13 crc kubenswrapper[4994]: E0310 00:31:13.041642 4994 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.164:47462->38.102.83.164:37473: read tcp 38.102.83.164:47462->38.102.83.164:37473: read: connection reset by peer Mar 10 
00:31:13 crc kubenswrapper[4994]: I0310 00:31:13.807621 4994 generic.go:334] "Generic (PLEG): container finished" podID="3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" containerID="0c01f1deef18e844bb567b3557ae2178106db4c54958eedf7439e442df977495" exitCode=0 Mar 10 00:31:13 crc kubenswrapper[4994]: I0310 00:31:13.807695 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f","Type":"ContainerDied","Data":"0c01f1deef18e844bb567b3557ae2178106db4c54958eedf7439e442df977495"} Mar 10 00:31:14 crc kubenswrapper[4994]: I0310 00:31:14.819075 4994 generic.go:334] "Generic (PLEG): container finished" podID="3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" containerID="3f34506bf5d0590830eeb3f08b1bed951f6f1560b97971ea45d24f66e9003fe6" exitCode=0 Mar 10 00:31:14 crc kubenswrapper[4994]: I0310 00:31:14.819146 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f","Type":"ContainerDied","Data":"3f34506bf5d0590830eeb3f08b1bed951f6f1560b97971ea45d24f66e9003fe6"} Mar 10 00:31:14 crc kubenswrapper[4994]: I0310 00:31:14.878669 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-2-build_3dee5c53-e9b2-4246-8d9a-30fa345c1f0f/manage-dockerfile/0.log" Mar 10 00:31:15 crc kubenswrapper[4994]: I0310 00:31:15.833252 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f","Type":"ContainerStarted","Data":"c8f445e7254180f6d08cfdcb52469d60bb6e8cb339809370ede142970d703bd9"} Mar 10 00:31:15 crc kubenswrapper[4994]: I0310 00:31:15.887281 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-bundle-2-build" podStartSLOduration=5.887228816 
podStartE2EDuration="5.887228816s" podCreationTimestamp="2026-03-10 00:31:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:31:15.876522571 +0000 UTC m=+1490.050229410" watchObservedRunningTime="2026-03-10 00:31:15.887228816 +0000 UTC m=+1490.060935605" Mar 10 00:31:18 crc kubenswrapper[4994]: I0310 00:31:18.892530 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:31:18 crc kubenswrapper[4994]: I0310 00:31:18.893346 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:31:19 crc kubenswrapper[4994]: I0310 00:31:19.883729 4994 generic.go:334] "Generic (PLEG): container finished" podID="3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" containerID="c8f445e7254180f6d08cfdcb52469d60bb6e8cb339809370ede142970d703bd9" exitCode=0 Mar 10 00:31:19 crc kubenswrapper[4994]: I0310 00:31:19.883830 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f","Type":"ContainerDied","Data":"c8f445e7254180f6d08cfdcb52469d60bb6e8cb339809370ede142970d703bd9"} Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.246082 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368335 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-957nf\" (UniqueName: \"kubernetes.io/projected/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-kube-api-access-957nf\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368396 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-proxy-ca-bundles\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368422 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-run\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368486 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-system-configs\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368509 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-blob-cache\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368541 4994 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-push\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368563 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildcachedir\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368582 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-node-pullsecrets\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368643 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-root\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368661 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildworkdir\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368686 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: 
\"kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-pull\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368718 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-ca-bundles\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.369431 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.369471 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.369573 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.370696 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.372144 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.372277 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.373065 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.373808 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.376048 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-kube-api-access-957nf" (OuterVolumeSpecName: "kube-api-access-957nf") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "kube-api-access-957nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.376903 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.376864 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.384682 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469764 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469804 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-957nf\" (UniqueName: \"kubernetes.io/projected/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-kube-api-access-957nf\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469819 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469830 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469842 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469852 4994 
reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469906 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469920 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469930 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469941 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469953 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469964 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.905034 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f","Type":"ContainerDied","Data":"072eb920df43a19ae92b79127c1e5d8130581afedbc74bbcfd7fc4942ec7c1aa"} Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.905113 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="072eb920df43a19ae92b79127c1e5d8130581afedbc74bbcfd7fc4942ec7c1aa" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.905125 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.988787 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 10 00:31:37 crc kubenswrapper[4994]: E0310 00:31:37.989808 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" containerName="manage-dockerfile" Mar 10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.989830 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" containerName="manage-dockerfile" Mar 10 00:31:37 crc kubenswrapper[4994]: E0310 00:31:37.989868 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" containerName="git-clone" Mar 10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.989909 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" containerName="git-clone" Mar 10 00:31:37 crc kubenswrapper[4994]: E0310 00:31:37.989941 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" containerName="docker-build" Mar 10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.989954 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" containerName="docker-build" Mar 
10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.990157 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" containerName="docker-build" Mar 10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.991617 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.994317 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-sys-config" Mar 10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.995729 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-ca" Mar 10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.998136 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-global-ca" Mar 10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.998418 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.999684 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-framework-index-dockercfg" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.013192 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.121613 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " 
pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.121688 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p44sn\" (UniqueName: \"kubernetes.io/projected/bc817ebb-7ab7-41d2-8961-92191d7749e9-kube-api-access-p44sn\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.121729 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.121861 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.122030 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.122121 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.122220 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.122327 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.122360 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.122375 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-root\") pod 
\"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.122405 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.122444 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.122461 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224472 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " 
pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224557 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224596 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224636 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224682 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224713 4994 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224753 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224792 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224826 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p44sn\" (UniqueName: \"kubernetes.io/projected/bc817ebb-7ab7-41d2-8961-92191d7749e9-kube-api-access-p44sn\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224893 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: 
\"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224944 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224997 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.225054 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.225364 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.225903 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.226041 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.226046 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.226124 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.226188 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 
00:31:38.226713 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.227181 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.227182 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.235781 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.236179 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: 
\"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.241493 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.249391 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p44sn\" (UniqueName: \"kubernetes.io/projected/bc817ebb-7ab7-41d2-8961-92191d7749e9-kube-api-access-p44sn\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.313594 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.631399 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 10 00:31:39 crc kubenswrapper[4994]: I0310 00:31:39.087969 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"bc817ebb-7ab7-41d2-8961-92191d7749e9","Type":"ContainerStarted","Data":"5982f0ac6e6cb6b4ab7545dbf91d9e7d9897958af8f6e1bcb23dd95fc5ffd82f"} Mar 10 00:31:39 crc kubenswrapper[4994]: I0310 00:31:39.088033 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"bc817ebb-7ab7-41d2-8961-92191d7749e9","Type":"ContainerStarted","Data":"d460ca6197c6169b8175da1780d793119656b56b9e6b3e7442efa458df8fdfa3"} Mar 10 00:31:40 crc kubenswrapper[4994]: I0310 00:31:40.100036 4994 generic.go:334] "Generic (PLEG): container finished" podID="bc817ebb-7ab7-41d2-8961-92191d7749e9" containerID="5982f0ac6e6cb6b4ab7545dbf91d9e7d9897958af8f6e1bcb23dd95fc5ffd82f" exitCode=0 Mar 10 00:31:40 crc kubenswrapper[4994]: I0310 00:31:40.100137 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"bc817ebb-7ab7-41d2-8961-92191d7749e9","Type":"ContainerDied","Data":"5982f0ac6e6cb6b4ab7545dbf91d9e7d9897958af8f6e1bcb23dd95fc5ffd82f"} Mar 10 00:31:41 crc kubenswrapper[4994]: I0310 00:31:41.110270 4994 generic.go:334] "Generic (PLEG): container finished" podID="bc817ebb-7ab7-41d2-8961-92191d7749e9" containerID="6706d407faeb67d398bcf4d8c3facdfecbf69671980d8bbc6f719c8f6a184918" exitCode=0 Mar 10 00:31:41 crc kubenswrapper[4994]: I0310 00:31:41.110322 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" 
event={"ID":"bc817ebb-7ab7-41d2-8961-92191d7749e9","Type":"ContainerDied","Data":"6706d407faeb67d398bcf4d8c3facdfecbf69671980d8bbc6f719c8f6a184918"} Mar 10 00:31:41 crc kubenswrapper[4994]: I0310 00:31:41.161863 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-1-build_bc817ebb-7ab7-41d2-8961-92191d7749e9/manage-dockerfile/0.log" Mar 10 00:31:42 crc kubenswrapper[4994]: I0310 00:31:42.126496 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"bc817ebb-7ab7-41d2-8961-92191d7749e9","Type":"ContainerStarted","Data":"3d75614a57566feda03e9701c8258587fc4a6f9dcc314b90f72c6fbb29937a14"} Mar 10 00:31:42 crc kubenswrapper[4994]: I0310 00:31:42.179360 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-index-1-build" podStartSLOduration=5.179335128 podStartE2EDuration="5.179335128s" podCreationTimestamp="2026-03-10 00:31:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:31:42.172070453 +0000 UTC m=+1516.345777242" watchObservedRunningTime="2026-03-10 00:31:42.179335128 +0000 UTC m=+1516.353041917" Mar 10 00:31:48 crc kubenswrapper[4994]: I0310 00:31:48.892732 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:31:48 crc kubenswrapper[4994]: I0310 00:31:48.893420 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.148944 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551712-nx9pb"] Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.150351 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551712-nx9pb" Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.153201 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.153345 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.153437 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.169544 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551712-nx9pb"] Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.230583 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggdf8\" (UniqueName: \"kubernetes.io/projected/615394b2-0705-4358-853e-8c52eb448519-kube-api-access-ggdf8\") pod \"auto-csr-approver-29551712-nx9pb\" (UID: \"615394b2-0705-4358-853e-8c52eb448519\") " pod="openshift-infra/auto-csr-approver-29551712-nx9pb" Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.332954 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggdf8\" (UniqueName: \"kubernetes.io/projected/615394b2-0705-4358-853e-8c52eb448519-kube-api-access-ggdf8\") pod \"auto-csr-approver-29551712-nx9pb\" (UID: \"615394b2-0705-4358-853e-8c52eb448519\") " 
pod="openshift-infra/auto-csr-approver-29551712-nx9pb" Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.375608 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggdf8\" (UniqueName: \"kubernetes.io/projected/615394b2-0705-4358-853e-8c52eb448519-kube-api-access-ggdf8\") pod \"auto-csr-approver-29551712-nx9pb\" (UID: \"615394b2-0705-4358-853e-8c52eb448519\") " pod="openshift-infra/auto-csr-approver-29551712-nx9pb" Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.483956 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551712-nx9pb" Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.731668 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551712-nx9pb"] Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.740979 4994 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 00:32:01 crc kubenswrapper[4994]: I0310 00:32:01.269993 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551712-nx9pb" event={"ID":"615394b2-0705-4358-853e-8c52eb448519","Type":"ContainerStarted","Data":"03e1ec631057a240591c7aee984d946412c033bd597ebc69609979acd12bee59"} Mar 10 00:32:03 crc kubenswrapper[4994]: I0310 00:32:03.289990 4994 generic.go:334] "Generic (PLEG): container finished" podID="615394b2-0705-4358-853e-8c52eb448519" containerID="ab12f6f7b139f927c15eec55fa9992338c6ae56c8336c6e012df890d87e1461b" exitCode=0 Mar 10 00:32:03 crc kubenswrapper[4994]: I0310 00:32:03.290088 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551712-nx9pb" event={"ID":"615394b2-0705-4358-853e-8c52eb448519","Type":"ContainerDied","Data":"ab12f6f7b139f927c15eec55fa9992338c6ae56c8336c6e012df890d87e1461b"} Mar 10 00:32:04 crc kubenswrapper[4994]: I0310 00:32:04.590139 4994 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551712-nx9pb" Mar 10 00:32:04 crc kubenswrapper[4994]: I0310 00:32:04.696457 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggdf8\" (UniqueName: \"kubernetes.io/projected/615394b2-0705-4358-853e-8c52eb448519-kube-api-access-ggdf8\") pod \"615394b2-0705-4358-853e-8c52eb448519\" (UID: \"615394b2-0705-4358-853e-8c52eb448519\") " Mar 10 00:32:04 crc kubenswrapper[4994]: I0310 00:32:04.703933 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/615394b2-0705-4358-853e-8c52eb448519-kube-api-access-ggdf8" (OuterVolumeSpecName: "kube-api-access-ggdf8") pod "615394b2-0705-4358-853e-8c52eb448519" (UID: "615394b2-0705-4358-853e-8c52eb448519"). InnerVolumeSpecName "kube-api-access-ggdf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:32:04 crc kubenswrapper[4994]: I0310 00:32:04.798119 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggdf8\" (UniqueName: \"kubernetes.io/projected/615394b2-0705-4358-853e-8c52eb448519-kube-api-access-ggdf8\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:05 crc kubenswrapper[4994]: I0310 00:32:05.312445 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551712-nx9pb" event={"ID":"615394b2-0705-4358-853e-8c52eb448519","Type":"ContainerDied","Data":"03e1ec631057a240591c7aee984d946412c033bd597ebc69609979acd12bee59"} Mar 10 00:32:05 crc kubenswrapper[4994]: I0310 00:32:05.312533 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03e1ec631057a240591c7aee984d946412c033bd597ebc69609979acd12bee59" Mar 10 00:32:05 crc kubenswrapper[4994]: I0310 00:32:05.312671 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551712-nx9pb" Mar 10 00:32:05 crc kubenswrapper[4994]: I0310 00:32:05.670359 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551706-trj4h"] Mar 10 00:32:05 crc kubenswrapper[4994]: I0310 00:32:05.679141 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551706-trj4h"] Mar 10 00:32:06 crc kubenswrapper[4994]: I0310 00:32:06.562924 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c045a416-3fda-4dc3-b95a-15be10565d84" path="/var/lib/kubelet/pods/c045a416-3fda-4dc3-b95a-15be10565d84/volumes" Mar 10 00:32:12 crc kubenswrapper[4994]: I0310 00:32:12.373561 4994 generic.go:334] "Generic (PLEG): container finished" podID="bc817ebb-7ab7-41d2-8961-92191d7749e9" containerID="3d75614a57566feda03e9701c8258587fc4a6f9dcc314b90f72c6fbb29937a14" exitCode=0 Mar 10 00:32:12 crc kubenswrapper[4994]: I0310 00:32:12.373782 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"bc817ebb-7ab7-41d2-8961-92191d7749e9","Type":"ContainerDied","Data":"3d75614a57566feda03e9701c8258587fc4a6f9dcc314b90f72c6fbb29937a14"} Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.686773 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.830752 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-ca-bundles\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.830806 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildworkdir\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.830829 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-proxy-ca-bundles\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.830866 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-blob-cache\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.830913 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-push\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.830941 4994 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-root\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.830965 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildcachedir\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.831000 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-run\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.831046 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-system-configs\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.831089 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.831116 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p44sn\" (UniqueName: 
\"kubernetes.io/projected/bc817ebb-7ab7-41d2-8961-92191d7749e9-kube-api-access-p44sn\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.831164 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-pull\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.831209 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-node-pullsecrets\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.831463 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.831934 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.832195 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.832320 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.832734 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.833451 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.834127 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.840138 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.842142 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-service-telemetry-framework-index-dockercfg-user-build-volume" (OuterVolumeSpecName: "service-telemetry-framework-index-dockercfg-user-build-volume") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "service-telemetry-framework-index-dockercfg-user-build-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.842396 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.843439 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc817ebb-7ab7-41d2-8961-92191d7749e9-kube-api-access-p44sn" (OuterVolumeSpecName: "kube-api-access-p44sn") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "kube-api-access-p44sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.932966 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.933005 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.933019 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.933031 4994 reconciler_common.go:293] "Volume detached for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-service-telemetry-framework-index-dockercfg-user-build-volume\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.933044 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p44sn\" (UniqueName: \"kubernetes.io/projected/bc817ebb-7ab7-41d2-8961-92191d7749e9-kube-api-access-p44sn\") on node \"crc\" DevicePath \"\"" Mar 10 
00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.933055 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.933066 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.933076 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.933090 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.933100 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.933112 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:14 crc kubenswrapper[4994]: I0310 00:32:14.062426 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: 
"bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:32:14 crc kubenswrapper[4994]: I0310 00:32:14.136556 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:14 crc kubenswrapper[4994]: I0310 00:32:14.394090 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"bc817ebb-7ab7-41d2-8961-92191d7749e9","Type":"ContainerDied","Data":"d460ca6197c6169b8175da1780d793119656b56b9e6b3e7442efa458df8fdfa3"} Mar 10 00:32:14 crc kubenswrapper[4994]: I0310 00:32:14.394760 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d460ca6197c6169b8175da1780d793119656b56b9e6b3e7442efa458df8fdfa3" Mar 10 00:32:14 crc kubenswrapper[4994]: I0310 00:32:14.394191 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:32:15 crc kubenswrapper[4994]: I0310 00:32:15.920439 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-sf9c4"] Mar 10 00:32:15 crc kubenswrapper[4994]: E0310 00:32:15.921286 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc817ebb-7ab7-41d2-8961-92191d7749e9" containerName="git-clone" Mar 10 00:32:15 crc kubenswrapper[4994]: I0310 00:32:15.921312 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc817ebb-7ab7-41d2-8961-92191d7749e9" containerName="git-clone" Mar 10 00:32:15 crc kubenswrapper[4994]: E0310 00:32:15.921339 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc817ebb-7ab7-41d2-8961-92191d7749e9" containerName="docker-build" Mar 10 00:32:15 crc kubenswrapper[4994]: I0310 00:32:15.921352 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc817ebb-7ab7-41d2-8961-92191d7749e9" containerName="docker-build" Mar 10 00:32:15 crc kubenswrapper[4994]: E0310 00:32:15.921377 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615394b2-0705-4358-853e-8c52eb448519" containerName="oc" Mar 10 00:32:15 crc kubenswrapper[4994]: I0310 00:32:15.921389 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="615394b2-0705-4358-853e-8c52eb448519" containerName="oc" Mar 10 00:32:15 crc kubenswrapper[4994]: E0310 00:32:15.921413 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc817ebb-7ab7-41d2-8961-92191d7749e9" containerName="manage-dockerfile" Mar 10 00:32:15 crc kubenswrapper[4994]: I0310 00:32:15.921426 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc817ebb-7ab7-41d2-8961-92191d7749e9" containerName="manage-dockerfile" Mar 10 00:32:15 crc kubenswrapper[4994]: I0310 00:32:15.921629 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="615394b2-0705-4358-853e-8c52eb448519" containerName="oc" Mar 
10 00:32:15 crc kubenswrapper[4994]: I0310 00:32:15.921646 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc817ebb-7ab7-41d2-8961-92191d7749e9" containerName="docker-build" Mar 10 00:32:15 crc kubenswrapper[4994]: I0310 00:32:15.922381 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-sf9c4" Mar 10 00:32:15 crc kubenswrapper[4994]: I0310 00:32:15.929100 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"infrawatch-operators-dockercfg-nfz6x" Mar 10 00:32:15 crc kubenswrapper[4994]: I0310 00:32:15.957756 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-sf9c4"] Mar 10 00:32:16 crc kubenswrapper[4994]: I0310 00:32:16.084207 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d4jz\" (UniqueName: \"kubernetes.io/projected/01668f0d-50fe-449b-9fc8-2b949a68bb4e-kube-api-access-8d4jz\") pod \"infrawatch-operators-sf9c4\" (UID: \"01668f0d-50fe-449b-9fc8-2b949a68bb4e\") " pod="service-telemetry/infrawatch-operators-sf9c4" Mar 10 00:32:16 crc kubenswrapper[4994]: I0310 00:32:16.185479 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d4jz\" (UniqueName: \"kubernetes.io/projected/01668f0d-50fe-449b-9fc8-2b949a68bb4e-kube-api-access-8d4jz\") pod \"infrawatch-operators-sf9c4\" (UID: \"01668f0d-50fe-449b-9fc8-2b949a68bb4e\") " pod="service-telemetry/infrawatch-operators-sf9c4" Mar 10 00:32:16 crc kubenswrapper[4994]: I0310 00:32:16.208652 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d4jz\" (UniqueName: \"kubernetes.io/projected/01668f0d-50fe-449b-9fc8-2b949a68bb4e-kube-api-access-8d4jz\") pod \"infrawatch-operators-sf9c4\" (UID: \"01668f0d-50fe-449b-9fc8-2b949a68bb4e\") " pod="service-telemetry/infrawatch-operators-sf9c4" Mar 10 00:32:16 crc 
kubenswrapper[4994]: I0310 00:32:16.286369 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-sf9c4" Mar 10 00:32:16 crc kubenswrapper[4994]: I0310 00:32:16.388974 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:32:16 crc kubenswrapper[4994]: I0310 00:32:16.487471 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-sf9c4"] Mar 10 00:32:16 crc kubenswrapper[4994]: I0310 00:32:16.490239 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:17 crc kubenswrapper[4994]: I0310 00:32:17.416233 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-sf9c4" event={"ID":"01668f0d-50fe-449b-9fc8-2b949a68bb4e","Type":"ContainerStarted","Data":"ec33abc3badf44282af2cba3baa8e66b6f8f9dce90c50cf57dd91d9f331c1b26"} Mar 10 00:32:18 crc kubenswrapper[4994]: I0310 00:32:18.892197 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:32:18 crc kubenswrapper[4994]: I0310 00:32:18.892272 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:32:18 crc kubenswrapper[4994]: I0310 00:32:18.892330 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:32:18 crc kubenswrapper[4994]: I0310 00:32:18.893151 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d8d935625d60ec1fe79acd428aa0c427cb2a184ba8e0a37f25ea8bb9485e5629"} pod="openshift-machine-config-operator/machine-config-daemon-kfljj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 00:32:18 crc kubenswrapper[4994]: I0310 00:32:18.893239 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" containerID="cri-o://d8d935625d60ec1fe79acd428aa0c427cb2a184ba8e0a37f25ea8bb9485e5629" gracePeriod=600 Mar 10 00:32:19 crc kubenswrapper[4994]: I0310 00:32:19.431773 4994 generic.go:334] "Generic (PLEG): container finished" podID="ced5d66d-39df-4267-b801-e1e60d517ace" containerID="d8d935625d60ec1fe79acd428aa0c427cb2a184ba8e0a37f25ea8bb9485e5629" exitCode=0 Mar 10 00:32:19 crc kubenswrapper[4994]: I0310 00:32:19.432106 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerDied","Data":"d8d935625d60ec1fe79acd428aa0c427cb2a184ba8e0a37f25ea8bb9485e5629"} Mar 10 00:32:19 crc kubenswrapper[4994]: I0310 00:32:19.432251 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" 
event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b"} Mar 10 00:32:19 crc kubenswrapper[4994]: I0310 00:32:19.432272 4994 scope.go:117] "RemoveContainer" containerID="779b11783e082c837efecec96b026cd3be87293636b7184dfd3efe1ae146c491" Mar 10 00:32:20 crc kubenswrapper[4994]: I0310 00:32:20.704677 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-sf9c4"] Mar 10 00:32:21 crc kubenswrapper[4994]: I0310 00:32:21.524070 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-hhvb6"] Mar 10 00:32:21 crc kubenswrapper[4994]: I0310 00:32:21.525481 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-hhvb6"] Mar 10 00:32:21 crc kubenswrapper[4994]: I0310 00:32:21.525605 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-hhvb6" Mar 10 00:32:21 crc kubenswrapper[4994]: I0310 00:32:21.662763 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh9hc\" (UniqueName: \"kubernetes.io/projected/ea63ae5e-58aa-4f18-b14b-514e618f4839-kube-api-access-zh9hc\") pod \"infrawatch-operators-hhvb6\" (UID: \"ea63ae5e-58aa-4f18-b14b-514e618f4839\") " pod="service-telemetry/infrawatch-operators-hhvb6" Mar 10 00:32:21 crc kubenswrapper[4994]: I0310 00:32:21.764195 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh9hc\" (UniqueName: \"kubernetes.io/projected/ea63ae5e-58aa-4f18-b14b-514e618f4839-kube-api-access-zh9hc\") pod \"infrawatch-operators-hhvb6\" (UID: \"ea63ae5e-58aa-4f18-b14b-514e618f4839\") " pod="service-telemetry/infrawatch-operators-hhvb6" Mar 10 00:32:21 crc kubenswrapper[4994]: I0310 00:32:21.782700 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zh9hc\" (UniqueName: \"kubernetes.io/projected/ea63ae5e-58aa-4f18-b14b-514e618f4839-kube-api-access-zh9hc\") pod \"infrawatch-operators-hhvb6\" (UID: \"ea63ae5e-58aa-4f18-b14b-514e618f4839\") " pod="service-telemetry/infrawatch-operators-hhvb6" Mar 10 00:32:21 crc kubenswrapper[4994]: I0310 00:32:21.879428 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-hhvb6" Mar 10 00:32:22 crc kubenswrapper[4994]: I0310 00:32:22.225491 4994 scope.go:117] "RemoveContainer" containerID="4533fb5052fb646d2bdd6d148243818a0541d9f71db00a897459732577383e18" Mar 10 00:32:29 crc kubenswrapper[4994]: I0310 00:32:29.182451 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-hhvb6"] Mar 10 00:32:29 crc kubenswrapper[4994]: W0310 00:32:29.278852 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea63ae5e_58aa_4f18_b14b_514e618f4839.slice/crio-58e384788fdc7cd4b21d43b780461a6b0945556ce063d139f11eabf1041cba75 WatchSource:0}: Error finding container 58e384788fdc7cd4b21d43b780461a6b0945556ce063d139f11eabf1041cba75: Status 404 returned error can't find the container with id 58e384788fdc7cd4b21d43b780461a6b0945556ce063d139f11eabf1041cba75 Mar 10 00:32:29 crc kubenswrapper[4994]: I0310 00:32:29.517846 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-hhvb6" event={"ID":"ea63ae5e-58aa-4f18-b14b-514e618f4839","Type":"ContainerStarted","Data":"58e384788fdc7cd4b21d43b780461a6b0945556ce063d139f11eabf1041cba75"} Mar 10 00:32:30 crc kubenswrapper[4994]: I0310 00:32:30.527564 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-sf9c4" event={"ID":"01668f0d-50fe-449b-9fc8-2b949a68bb4e","Type":"ContainerStarted","Data":"9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6"} Mar 10 00:32:30 
crc kubenswrapper[4994]: I0310 00:32:30.527691 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-sf9c4" podUID="01668f0d-50fe-449b-9fc8-2b949a68bb4e" containerName="registry-server" containerID="cri-o://9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6" gracePeriod=2 Mar 10 00:32:30 crc kubenswrapper[4994]: I0310 00:32:30.530029 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-hhvb6" event={"ID":"ea63ae5e-58aa-4f18-b14b-514e618f4839","Type":"ContainerStarted","Data":"d951629f67a54bec89647252d09794985e055265472771262c3e440dc915a918"} Mar 10 00:32:30 crc kubenswrapper[4994]: I0310 00:32:30.547854 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-sf9c4" podStartSLOduration=2.56385 podStartE2EDuration="15.547831781s" podCreationTimestamp="2026-03-10 00:32:15 +0000 UTC" firstStartedPulling="2026-03-10 00:32:16.495663307 +0000 UTC m=+1550.669370056" lastFinishedPulling="2026-03-10 00:32:29.479645058 +0000 UTC m=+1563.653351837" observedRunningTime="2026-03-10 00:32:30.546135657 +0000 UTC m=+1564.719842406" watchObservedRunningTime="2026-03-10 00:32:30.547831781 +0000 UTC m=+1564.721538530" Mar 10 00:32:30 crc kubenswrapper[4994]: I0310 00:32:30.572897 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-hhvb6" podStartSLOduration=9.425579737 podStartE2EDuration="9.572867561s" podCreationTimestamp="2026-03-10 00:32:21 +0000 UTC" firstStartedPulling="2026-03-10 00:32:29.282447939 +0000 UTC m=+1563.456154708" lastFinishedPulling="2026-03-10 00:32:29.429735773 +0000 UTC m=+1563.603442532" observedRunningTime="2026-03-10 00:32:30.56776765 +0000 UTC m=+1564.741474399" watchObservedRunningTime="2026-03-10 00:32:30.572867561 +0000 UTC m=+1564.746574310" Mar 10 00:32:30 crc kubenswrapper[4994]: I0310 00:32:30.943945 
4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-sf9c4" Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.006945 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d4jz\" (UniqueName: \"kubernetes.io/projected/01668f0d-50fe-449b-9fc8-2b949a68bb4e-kube-api-access-8d4jz\") pod \"01668f0d-50fe-449b-9fc8-2b949a68bb4e\" (UID: \"01668f0d-50fe-449b-9fc8-2b949a68bb4e\") " Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.013490 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01668f0d-50fe-449b-9fc8-2b949a68bb4e-kube-api-access-8d4jz" (OuterVolumeSpecName: "kube-api-access-8d4jz") pod "01668f0d-50fe-449b-9fc8-2b949a68bb4e" (UID: "01668f0d-50fe-449b-9fc8-2b949a68bb4e"). InnerVolumeSpecName "kube-api-access-8d4jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.108541 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d4jz\" (UniqueName: \"kubernetes.io/projected/01668f0d-50fe-449b-9fc8-2b949a68bb4e-kube-api-access-8d4jz\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.544588 4994 generic.go:334] "Generic (PLEG): container finished" podID="01668f0d-50fe-449b-9fc8-2b949a68bb4e" containerID="9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6" exitCode=0 Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.546194 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-sf9c4" Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.547557 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-sf9c4" event={"ID":"01668f0d-50fe-449b-9fc8-2b949a68bb4e","Type":"ContainerDied","Data":"9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6"} Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.547629 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-sf9c4" event={"ID":"01668f0d-50fe-449b-9fc8-2b949a68bb4e","Type":"ContainerDied","Data":"ec33abc3badf44282af2cba3baa8e66b6f8f9dce90c50cf57dd91d9f331c1b26"} Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.547660 4994 scope.go:117] "RemoveContainer" containerID="9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6" Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.612296 4994 scope.go:117] "RemoveContainer" containerID="9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6" Mar 10 00:32:31 crc kubenswrapper[4994]: E0310 00:32:31.614434 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6\": container with ID starting with 9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6 not found: ID does not exist" containerID="9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6" Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.614493 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6"} err="failed to get container status \"9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6\": rpc error: code = NotFound desc = could not find container 
\"9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6\": container with ID starting with 9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6 not found: ID does not exist" Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.632972 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-sf9c4"] Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.643198 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-sf9c4"] Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.880518 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-hhvb6" Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.880967 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-hhvb6" Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.923805 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-hhvb6" Mar 10 00:32:32 crc kubenswrapper[4994]: I0310 00:32:32.566048 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01668f0d-50fe-449b-9fc8-2b949a68bb4e" path="/var/lib/kubelet/pods/01668f0d-50fe-449b-9fc8-2b949a68bb4e/volumes" Mar 10 00:32:41 crc kubenswrapper[4994]: I0310 00:32:41.934949 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-hhvb6" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.174842 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99"] Mar 10 00:32:44 crc kubenswrapper[4994]: E0310 00:32:44.176105 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01668f0d-50fe-449b-9fc8-2b949a68bb4e" containerName="registry-server" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 
00:32:44.176201 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="01668f0d-50fe-449b-9fc8-2b949a68bb4e" containerName="registry-server" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.176440 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="01668f0d-50fe-449b-9fc8-2b949a68bb4e" containerName="registry-server" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.177536 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.194978 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99"] Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.298569 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.298649 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.299048 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6vjf\" (UniqueName: 
\"kubernetes.io/projected/c127337a-56e6-4642-b020-920e566abbd8-kube-api-access-q6vjf\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.400710 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.400799 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.400979 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6vjf\" (UniqueName: \"kubernetes.io/projected/c127337a-56e6-4642-b020-920e566abbd8-kube-api-access-q6vjf\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.402102 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99\" (UID: 
\"c127337a-56e6-4642-b020-920e566abbd8\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.402204 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.452543 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6vjf\" (UniqueName: \"kubernetes.io/projected/c127337a-56e6-4642-b020-920e566abbd8-kube-api-access-q6vjf\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.513782 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.848298 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99"] Mar 10 00:32:44 crc kubenswrapper[4994]: W0310 00:32:44.852153 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc127337a_56e6_4642_b020_920e566abbd8.slice/crio-8baf587def5b5252d7f47162fd8fbc2fc7e54f6a059e943333d7cd9fd8dfe3f3 WatchSource:0}: Error finding container 8baf587def5b5252d7f47162fd8fbc2fc7e54f6a059e943333d7cd9fd8dfe3f3: Status 404 returned error can't find the container with id 8baf587def5b5252d7f47162fd8fbc2fc7e54f6a059e943333d7cd9fd8dfe3f3 Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.976814 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7"] Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.978902 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7"
Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.992317 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7"]
Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.010547 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7"
Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.010596 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j48cc\" (UniqueName: \"kubernetes.io/projected/8d5571bf-db5f-44e5-90a1-498f2f969ca8-kube-api-access-j48cc\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7"
Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.010642 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7"
Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.111558 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7"
Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.111711 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7"
Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.111774 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j48cc\" (UniqueName: \"kubernetes.io/projected/8d5571bf-db5f-44e5-90a1-498f2f969ca8-kube-api-access-j48cc\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7"
Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.112654 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7"
Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.112746 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7"
Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.136603 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j48cc\" (UniqueName: \"kubernetes.io/projected/8d5571bf-db5f-44e5-90a1-498f2f969ca8-kube-api-access-j48cc\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7"
Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.310749 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7"
Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.706070 4994 generic.go:334] "Generic (PLEG): container finished" podID="c127337a-56e6-4642-b020-920e566abbd8" containerID="e78f149f3ee66141e736adb1d0dc9a4c936fa8f53c768cb49326775840a68b0e" exitCode=0
Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.706210 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" event={"ID":"c127337a-56e6-4642-b020-920e566abbd8","Type":"ContainerDied","Data":"e78f149f3ee66141e736adb1d0dc9a4c936fa8f53c768cb49326775840a68b0e"}
Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.706561 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" event={"ID":"c127337a-56e6-4642-b020-920e566abbd8","Type":"ContainerStarted","Data":"8baf587def5b5252d7f47162fd8fbc2fc7e54f6a059e943333d7cd9fd8dfe3f3"}
Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.837682 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7"]
Mar 10 00:32:45 crc kubenswrapper[4994]: W0310 00:32:45.838952 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d5571bf_db5f_44e5_90a1_498f2f969ca8.slice/crio-86e7a6bfb00006ef9c88c38d656ae4dd22b5e7518ad52ea26c2f316565df2f46 WatchSource:0}: Error finding container 86e7a6bfb00006ef9c88c38d656ae4dd22b5e7518ad52ea26c2f316565df2f46: Status 404 returned error can't find the container with id 86e7a6bfb00006ef9c88c38d656ae4dd22b5e7518ad52ea26c2f316565df2f46
Mar 10 00:32:46 crc kubenswrapper[4994]: I0310 00:32:46.718304 4994 generic.go:334] "Generic (PLEG): container finished" podID="c127337a-56e6-4642-b020-920e566abbd8" containerID="eb83498b13dc98bff97d5a0400a7387ef755246e02a753234d0ab169f8b47795" exitCode=0
Mar 10 00:32:46 crc kubenswrapper[4994]: I0310 00:32:46.718392 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" event={"ID":"c127337a-56e6-4642-b020-920e566abbd8","Type":"ContainerDied","Data":"eb83498b13dc98bff97d5a0400a7387ef755246e02a753234d0ab169f8b47795"}
Mar 10 00:32:46 crc kubenswrapper[4994]: I0310 00:32:46.723213 4994 generic.go:334] "Generic (PLEG): container finished" podID="8d5571bf-db5f-44e5-90a1-498f2f969ca8" containerID="4d07a4d08afb8e9202cbc788aac3bbf88b51a45d2b4a34125ca3aa3c061a0f90" exitCode=0
Mar 10 00:32:46 crc kubenswrapper[4994]: I0310 00:32:46.723264 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" event={"ID":"8d5571bf-db5f-44e5-90a1-498f2f969ca8","Type":"ContainerDied","Data":"4d07a4d08afb8e9202cbc788aac3bbf88b51a45d2b4a34125ca3aa3c061a0f90"}
Mar 10 00:32:46 crc kubenswrapper[4994]: I0310 00:32:46.723303 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" event={"ID":"8d5571bf-db5f-44e5-90a1-498f2f969ca8","Type":"ContainerStarted","Data":"86e7a6bfb00006ef9c88c38d656ae4dd22b5e7518ad52ea26c2f316565df2f46"}
Mar 10 00:32:47 crc kubenswrapper[4994]: I0310 00:32:47.738842 4994 generic.go:334] "Generic (PLEG): container finished" podID="c127337a-56e6-4642-b020-920e566abbd8" containerID="8dd12520e968c32892f2a71390fb9c5d48b8062b13352b7e632f3e5e28318421" exitCode=0
Mar 10 00:32:47 crc kubenswrapper[4994]: I0310 00:32:47.738938 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" event={"ID":"c127337a-56e6-4642-b020-920e566abbd8","Type":"ContainerDied","Data":"8dd12520e968c32892f2a71390fb9c5d48b8062b13352b7e632f3e5e28318421"}
Mar 10 00:32:47 crc kubenswrapper[4994]: I0310 00:32:47.742936 4994 generic.go:334] "Generic (PLEG): container finished" podID="8d5571bf-db5f-44e5-90a1-498f2f969ca8" containerID="6ab6fc48ccef71475117dba5680a4efdfd25a0886352f7b62443696caf5ea0ac" exitCode=0
Mar 10 00:32:47 crc kubenswrapper[4994]: I0310 00:32:47.743010 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" event={"ID":"8d5571bf-db5f-44e5-90a1-498f2f969ca8","Type":"ContainerDied","Data":"6ab6fc48ccef71475117dba5680a4efdfd25a0886352f7b62443696caf5ea0ac"}
Mar 10 00:32:48 crc kubenswrapper[4994]: I0310 00:32:48.753678 4994 generic.go:334] "Generic (PLEG): container finished" podID="8d5571bf-db5f-44e5-90a1-498f2f969ca8" containerID="3f89fc695b0342472abe2241f0c45238e106e6bb5e75eded193907abdb1b81a1" exitCode=0
Mar 10 00:32:48 crc kubenswrapper[4994]: I0310 00:32:48.753779 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" event={"ID":"8d5571bf-db5f-44e5-90a1-498f2f969ca8","Type":"ContainerDied","Data":"3f89fc695b0342472abe2241f0c45238e106e6bb5e75eded193907abdb1b81a1"}
Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.050077 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99"
Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.066850 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-bundle\") pod \"c127337a-56e6-4642-b020-920e566abbd8\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") "
Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.067084 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6vjf\" (UniqueName: \"kubernetes.io/projected/c127337a-56e6-4642-b020-920e566abbd8-kube-api-access-q6vjf\") pod \"c127337a-56e6-4642-b020-920e566abbd8\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") "
Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.067324 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-util\") pod \"c127337a-56e6-4642-b020-920e566abbd8\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") "
Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.067950 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-bundle" (OuterVolumeSpecName: "bundle") pod "c127337a-56e6-4642-b020-920e566abbd8" (UID: "c127337a-56e6-4642-b020-920e566abbd8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.079127 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c127337a-56e6-4642-b020-920e566abbd8-kube-api-access-q6vjf" (OuterVolumeSpecName: "kube-api-access-q6vjf") pod "c127337a-56e6-4642-b020-920e566abbd8" (UID: "c127337a-56e6-4642-b020-920e566abbd8"). InnerVolumeSpecName "kube-api-access-q6vjf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.091543 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-util" (OuterVolumeSpecName: "util") pod "c127337a-56e6-4642-b020-920e566abbd8" (UID: "c127337a-56e6-4642-b020-920e566abbd8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.169055 4994 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-util\") on node \"crc\" DevicePath \"\""
Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.169100 4994 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.169119 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6vjf\" (UniqueName: \"kubernetes.io/projected/c127337a-56e6-4642-b020-920e566abbd8-kube-api-access-q6vjf\") on node \"crc\" DevicePath \"\""
Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.766024 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99"
Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.766013 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" event={"ID":"c127337a-56e6-4642-b020-920e566abbd8","Type":"ContainerDied","Data":"8baf587def5b5252d7f47162fd8fbc2fc7e54f6a059e943333d7cd9fd8dfe3f3"}
Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.766110 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8baf587def5b5252d7f47162fd8fbc2fc7e54f6a059e943333d7cd9fd8dfe3f3"
Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.104683 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7"
Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.181821 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j48cc\" (UniqueName: \"kubernetes.io/projected/8d5571bf-db5f-44e5-90a1-498f2f969ca8-kube-api-access-j48cc\") pod \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") "
Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.181943 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-bundle\") pod \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") "
Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.181998 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-util\") pod \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") "
Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.183071 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-bundle" (OuterVolumeSpecName: "bundle") pod "8d5571bf-db5f-44e5-90a1-498f2f969ca8" (UID: "8d5571bf-db5f-44e5-90a1-498f2f969ca8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.183480 4994 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.191274 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d5571bf-db5f-44e5-90a1-498f2f969ca8-kube-api-access-j48cc" (OuterVolumeSpecName: "kube-api-access-j48cc") pod "8d5571bf-db5f-44e5-90a1-498f2f969ca8" (UID: "8d5571bf-db5f-44e5-90a1-498f2f969ca8"). InnerVolumeSpecName "kube-api-access-j48cc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.200519 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-util" (OuterVolumeSpecName: "util") pod "8d5571bf-db5f-44e5-90a1-498f2f969ca8" (UID: "8d5571bf-db5f-44e5-90a1-498f2f969ca8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.284855 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j48cc\" (UniqueName: \"kubernetes.io/projected/8d5571bf-db5f-44e5-90a1-498f2f969ca8-kube-api-access-j48cc\") on node \"crc\" DevicePath \"\""
Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.284923 4994 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-util\") on node \"crc\" DevicePath \"\""
Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.797446 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7"
Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.798613 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" event={"ID":"8d5571bf-db5f-44e5-90a1-498f2f969ca8","Type":"ContainerDied","Data":"86e7a6bfb00006ef9c88c38d656ae4dd22b5e7518ad52ea26c2f316565df2f46"}
Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.798649 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86e7a6bfb00006ef9c88c38d656ae4dd22b5e7518ad52ea26c2f316565df2f46"
Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.470931 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-656df8f446-7rqn6"]
Mar 10 00:32:57 crc kubenswrapper[4994]: E0310 00:32:57.471783 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d5571bf-db5f-44e5-90a1-498f2f969ca8" containerName="extract"
Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.471798 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d5571bf-db5f-44e5-90a1-498f2f969ca8" containerName="extract"
Mar 10 00:32:57 crc kubenswrapper[4994]: E0310 00:32:57.471810 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d5571bf-db5f-44e5-90a1-498f2f969ca8" containerName="pull"
Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.471819 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d5571bf-db5f-44e5-90a1-498f2f969ca8" containerName="pull"
Mar 10 00:32:57 crc kubenswrapper[4994]: E0310 00:32:57.471830 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d5571bf-db5f-44e5-90a1-498f2f969ca8" containerName="util"
Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.471839 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d5571bf-db5f-44e5-90a1-498f2f969ca8" containerName="util"
Mar 10 00:32:57 crc kubenswrapper[4994]: E0310 00:32:57.471849 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c127337a-56e6-4642-b020-920e566abbd8" containerName="extract"
Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.471857 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="c127337a-56e6-4642-b020-920e566abbd8" containerName="extract"
Mar 10 00:32:57 crc kubenswrapper[4994]: E0310 00:32:57.471910 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c127337a-56e6-4642-b020-920e566abbd8" containerName="pull"
Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.471918 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="c127337a-56e6-4642-b020-920e566abbd8" containerName="pull"
Mar 10 00:32:57 crc kubenswrapper[4994]: E0310 00:32:57.471931 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c127337a-56e6-4642-b020-920e566abbd8" containerName="util"
Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.471939 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="c127337a-56e6-4642-b020-920e566abbd8" containerName="util"
Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.472120 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="c127337a-56e6-4642-b020-920e566abbd8" containerName="extract"
Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.472143 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d5571bf-db5f-44e5-90a1-498f2f969ca8" containerName="extract"
Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.472685 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6"
Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.477457 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-operator-dockercfg-rvb57"
Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.500320 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-656df8f446-7rqn6"]
Mar 10 00:32:58 crc kubenswrapper[4994]: I0310 00:32:58.233960 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/134b5ce4-37cf-459f-9c27-dafae8eb9e86-runner\") pod \"service-telemetry-operator-656df8f446-7rqn6\" (UID: \"134b5ce4-37cf-459f-9c27-dafae8eb9e86\") " pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6"
Mar 10 00:32:58 crc kubenswrapper[4994]: I0310 00:32:58.234041 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp66c\" (UniqueName: \"kubernetes.io/projected/134b5ce4-37cf-459f-9c27-dafae8eb9e86-kube-api-access-qp66c\") pod \"service-telemetry-operator-656df8f446-7rqn6\" (UID: \"134b5ce4-37cf-459f-9c27-dafae8eb9e86\") " pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6"
Mar 10 00:32:58 crc kubenswrapper[4994]: I0310 00:32:58.336411 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/134b5ce4-37cf-459f-9c27-dafae8eb9e86-runner\") pod \"service-telemetry-operator-656df8f446-7rqn6\" (UID: \"134b5ce4-37cf-459f-9c27-dafae8eb9e86\") " pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6"
Mar 10 00:32:58 crc kubenswrapper[4994]: I0310 00:32:58.336516 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp66c\" (UniqueName: \"kubernetes.io/projected/134b5ce4-37cf-459f-9c27-dafae8eb9e86-kube-api-access-qp66c\") pod \"service-telemetry-operator-656df8f446-7rqn6\" (UID: \"134b5ce4-37cf-459f-9c27-dafae8eb9e86\") " pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6"
Mar 10 00:32:58 crc kubenswrapper[4994]: I0310 00:32:58.337891 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/134b5ce4-37cf-459f-9c27-dafae8eb9e86-runner\") pod \"service-telemetry-operator-656df8f446-7rqn6\" (UID: \"134b5ce4-37cf-459f-9c27-dafae8eb9e86\") " pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6"
Mar 10 00:32:58 crc kubenswrapper[4994]: I0310 00:32:58.366815 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp66c\" (UniqueName: \"kubernetes.io/projected/134b5ce4-37cf-459f-9c27-dafae8eb9e86-kube-api-access-qp66c\") pod \"service-telemetry-operator-656df8f446-7rqn6\" (UID: \"134b5ce4-37cf-459f-9c27-dafae8eb9e86\") " pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6"
Mar 10 00:32:58 crc kubenswrapper[4994]: I0310 00:32:58.390534 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6"
Mar 10 00:32:58 crc kubenswrapper[4994]: I0310 00:32:58.893426 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-656df8f446-7rqn6"]
Mar 10 00:32:59 crc kubenswrapper[4994]: I0310 00:32:59.281537 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6" event={"ID":"134b5ce4-37cf-459f-9c27-dafae8eb9e86","Type":"ContainerStarted","Data":"940bffb00a849370e8dfc93b5d044295c971e9d27843d9549585d5e848cb7a5c"}
Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.090448 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-7457956966-kbwlx"]
Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.091491 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx"
Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.097797 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-operator-dockercfg-svrw7"
Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.114675 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-7457956966-kbwlx"]
Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.162047 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/0730e042-f632-4db2-a694-b5917982d77d-runner\") pod \"smart-gateway-operator-7457956966-kbwlx\" (UID: \"0730e042-f632-4db2-a694-b5917982d77d\") " pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx"
Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.162113 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cck8\" (UniqueName: \"kubernetes.io/projected/0730e042-f632-4db2-a694-b5917982d77d-kube-api-access-8cck8\") pod \"smart-gateway-operator-7457956966-kbwlx\" (UID: \"0730e042-f632-4db2-a694-b5917982d77d\") " pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx"
Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.263147 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/0730e042-f632-4db2-a694-b5917982d77d-runner\") pod \"smart-gateway-operator-7457956966-kbwlx\" (UID: \"0730e042-f632-4db2-a694-b5917982d77d\") " pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx"
Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.263208 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cck8\" (UniqueName: \"kubernetes.io/projected/0730e042-f632-4db2-a694-b5917982d77d-kube-api-access-8cck8\") pod \"smart-gateway-operator-7457956966-kbwlx\" (UID: \"0730e042-f632-4db2-a694-b5917982d77d\") " pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx"
Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.263590 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/0730e042-f632-4db2-a694-b5917982d77d-runner\") pod \"smart-gateway-operator-7457956966-kbwlx\" (UID: \"0730e042-f632-4db2-a694-b5917982d77d\") " pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx"
Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.289415 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cck8\" (UniqueName: \"kubernetes.io/projected/0730e042-f632-4db2-a694-b5917982d77d-kube-api-access-8cck8\") pod \"smart-gateway-operator-7457956966-kbwlx\" (UID: \"0730e042-f632-4db2-a694-b5917982d77d\") " pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx"
Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.413357 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx"
Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.907172 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-7457956966-kbwlx"]
Mar 10 00:33:01 crc kubenswrapper[4994]: I0310 00:33:01.300482 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx" event={"ID":"0730e042-f632-4db2-a694-b5917982d77d","Type":"ContainerStarted","Data":"df25a59175fadab10f6d5b5817cbff5686da84c6460cba031ba25774701f5646"}
Mar 10 00:33:19 crc kubenswrapper[4994]: I0310 00:33:19.474172 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6" event={"ID":"134b5ce4-37cf-459f-9c27-dafae8eb9e86","Type":"ContainerStarted","Data":"e70afd2fb89cca0d4de8bfa3e1e955a780443e3c84f94c175ab446950c617f0d"}
Mar 10 00:33:19 crc kubenswrapper[4994]: I0310 00:33:19.501528 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6" podStartSLOduration=2.144236077 podStartE2EDuration="22.501508987s" podCreationTimestamp="2026-03-10 00:32:57 +0000 UTC" firstStartedPulling="2026-03-10 00:32:58.914665802 +0000 UTC m=+1593.088372551" lastFinishedPulling="2026-03-10 00:33:19.271938722 +0000 UTC m=+1613.445645461" observedRunningTime="2026-03-10 00:33:19.494926579 +0000 UTC m=+1613.668633338" watchObservedRunningTime="2026-03-10 00:33:19.501508987 +0000 UTC m=+1613.675215736"
Mar 10 00:33:24 crc kubenswrapper[4994]: I0310 00:33:24.525022 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx" event={"ID":"0730e042-f632-4db2-a694-b5917982d77d","Type":"ContainerStarted","Data":"c1a08e50b6d31c063393515002dd019ec4d9fcbf36d6f94b647697a1b6cbcb54"}
Mar 10 00:33:24 crc kubenswrapper[4994]: I0310 00:33:24.550590 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx" podStartSLOduration=1.315657342 podStartE2EDuration="24.550563684s" podCreationTimestamp="2026-03-10 00:33:00 +0000 UTC" firstStartedPulling="2026-03-10 00:33:00.924114189 +0000 UTC m=+1595.097820938" lastFinishedPulling="2026-03-10 00:33:24.159020521 +0000 UTC m=+1618.332727280" observedRunningTime="2026-03-10 00:33:24.541318578 +0000 UTC m=+1618.715025407" watchObservedRunningTime="2026-03-10 00:33:24.550563684 +0000 UTC m=+1618.724270473"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.745828 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-kmjj2"]
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.747618 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-kmjj2"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.750110 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.750319 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-ngmz6"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.750483 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.750569 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.752089 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.752511 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.757234 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.765516 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-kmjj2"]
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.852621 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rdzr\" (UniqueName: \"kubernetes.io/projected/bd538cc5-49ab-4de4-b202-9068ffe969df-kube-api-access-5rdzr\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.852667 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-users\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.852693 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.852723 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.852745 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.852789 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-config\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.852835 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.954338 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.954410 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rdzr\" (UniqueName: \"kubernetes.io/projected/bd538cc5-49ab-4de4-b202-9068ffe969df-kube-api-access-5rdzr\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.954437 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-users\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.954466 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.954500 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.954526 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.954576 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-config\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.961529 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.962241 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2"
Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.962373 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName:
\"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-users\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.962405 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.968325 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-config\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.989413 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.993010 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rdzr\" (UniqueName: \"kubernetes.io/projected/bd538cc5-49ab-4de4-b202-9068ffe969df-kube-api-access-5rdzr\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:48 crc 
kubenswrapper[4994]: I0310 00:33:48.067618 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:48 crc kubenswrapper[4994]: I0310 00:33:48.265596 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-kmjj2"] Mar 10 00:33:48 crc kubenswrapper[4994]: W0310 00:33:48.273016 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd538cc5_49ab_4de4_b202_9068ffe969df.slice/crio-a70d05716712507b24f298fa27a033f55a9aa604b32f2cbef81d39d3a28695ec WatchSource:0}: Error finding container a70d05716712507b24f298fa27a033f55a9aa604b32f2cbef81d39d3a28695ec: Status 404 returned error can't find the container with id a70d05716712507b24f298fa27a033f55a9aa604b32f2cbef81d39d3a28695ec Mar 10 00:33:48 crc kubenswrapper[4994]: I0310 00:33:48.735750 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" event={"ID":"bd538cc5-49ab-4de4-b202-9068ffe969df","Type":"ContainerStarted","Data":"a70d05716712507b24f298fa27a033f55a9aa604b32f2cbef81d39d3a28695ec"} Mar 10 00:33:53 crc kubenswrapper[4994]: I0310 00:33:53.780084 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" event={"ID":"bd538cc5-49ab-4de4-b202-9068ffe969df","Type":"ContainerStarted","Data":"b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186"} Mar 10 00:33:53 crc kubenswrapper[4994]: I0310 00:33:53.803898 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" podStartSLOduration=1.7704750150000002 podStartE2EDuration="6.803866967s" podCreationTimestamp="2026-03-10 00:33:47 +0000 UTC" firstStartedPulling="2026-03-10 00:33:48.274469637 +0000 UTC m=+1642.448176386" 
lastFinishedPulling="2026-03-10 00:33:53.307861589 +0000 UTC m=+1647.481568338" observedRunningTime="2026-03-10 00:33:53.798571022 +0000 UTC m=+1647.972277841" watchObservedRunningTime="2026-03-10 00:33:53.803866967 +0000 UTC m=+1647.977573706" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.723588 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.728522 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.730834 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-tls-assets-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.731236 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-1" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.731488 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-stf-dockercfg-8q24q" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.731555 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-session-secret" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.731723 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-web-config" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.731917 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-2" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.732323 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-prometheus-proxy-tls" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.732362 4994 reflector.go:368] Caches populated for *v1.Secret from 
object-"service-telemetry"/"prometheus-default" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.732413 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.732436 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"serving-certs-ca-bundle" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.855094 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.903532 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-749eb71b-ca9c-4295-a898-0f4a4ece462e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-749eb71b-ca9c-4295-a898-0f4a4ece462e\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.903851 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/69942722-a3c1-459b-96d3-260e0813093b-config-out\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.904006 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-config\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.904131 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-1\" 
(UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.904213 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-web-config\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.904293 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.904366 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.904447 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/69942722-a3c1-459b-96d3-260e0813093b-tls-assets\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.904520 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l68ww\" (UniqueName: \"kubernetes.io/projected/69942722-a3c1-459b-96d3-260e0813093b-kube-api-access-l68ww\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.904592 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.904683 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.904754 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005453 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-web-config\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 
10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005524 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005556 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005598 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/69942722-a3c1-459b-96d3-260e0813093b-tls-assets\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005624 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l68ww\" (UniqueName: \"kubernetes.io/projected/69942722-a3c1-459b-96d3-260e0813093b-kube-api-access-l68ww\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005657 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" 
Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005707 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005738 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005781 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-749eb71b-ca9c-4295-a898-0f4a4ece462e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-749eb71b-ca9c-4295-a898-0f4a4ece462e\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005812 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/69942722-a3c1-459b-96d3-260e0813093b-config-out\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005841 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-config\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" 
Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.006020 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.007354 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: E0310 00:33:58.009007 4994 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Mar 10 00:33:58 crc kubenswrapper[4994]: E0310 00:33:58.009238 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-secret-default-prometheus-proxy-tls podName:69942722-a3c1-459b-96d3-260e0813093b nodeName:}" failed. No retries permitted until 2026-03-10 00:33:58.509212845 +0000 UTC m=+1652.682919614 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "69942722-a3c1-459b-96d3-260e0813093b") : secret "default-prometheus-proxy-tls" not found Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.009833 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.010741 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.010924 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.014180 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/69942722-a3c1-459b-96d3-260e0813093b-config-out\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.014423 4994 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.015565 4994 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.015723 4994 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-749eb71b-ca9c-4295-a898-0f4a4ece462e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-749eb71b-ca9c-4295-a898-0f4a4ece462e\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3bf73a2b0d38f9974f6fad8c70186349ac24419a9badade344c26d83828013f9/globalmount\"" pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.019536 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-config\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.019791 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-web-config\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.019937 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/69942722-a3c1-459b-96d3-260e0813093b-tls-assets\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.052355 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l68ww\" (UniqueName: \"kubernetes.io/projected/69942722-a3c1-459b-96d3-260e0813093b-kube-api-access-l68ww\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.058099 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-749eb71b-ca9c-4295-a898-0f4a4ece462e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-749eb71b-ca9c-4295-a898-0f4a4ece462e\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.512292 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.516436 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.659391 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.876797 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 10 00:33:59 crc kubenswrapper[4994]: I0310 00:33:59.847093 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"69942722-a3c1-459b-96d3-260e0813093b","Type":"ContainerStarted","Data":"f98cbb4276c3f97fb9ef091fa2e193efa3a690a6754100283e460849e17ceb76"} Mar 10 00:34:00 crc kubenswrapper[4994]: I0310 00:34:00.122306 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551714-s79ft"] Mar 10 00:34:00 crc kubenswrapper[4994]: I0310 00:34:00.127904 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551714-s79ft" Mar 10 00:34:00 crc kubenswrapper[4994]: I0310 00:34:00.129741 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:34:00 crc kubenswrapper[4994]: I0310 00:34:00.130209 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:34:00 crc kubenswrapper[4994]: I0310 00:34:00.131289 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:34:00 crc kubenswrapper[4994]: I0310 00:34:00.135260 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551714-s79ft"] Mar 10 00:34:00 crc kubenswrapper[4994]: I0310 00:34:00.234935 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-549f4\" (UniqueName: \"kubernetes.io/projected/3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c-kube-api-access-549f4\") pod \"auto-csr-approver-29551714-s79ft\" (UID: \"3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c\") " 
pod="openshift-infra/auto-csr-approver-29551714-s79ft" Mar 10 00:34:00 crc kubenswrapper[4994]: I0310 00:34:00.336815 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-549f4\" (UniqueName: \"kubernetes.io/projected/3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c-kube-api-access-549f4\") pod \"auto-csr-approver-29551714-s79ft\" (UID: \"3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c\") " pod="openshift-infra/auto-csr-approver-29551714-s79ft" Mar 10 00:34:00 crc kubenswrapper[4994]: I0310 00:34:00.359902 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-549f4\" (UniqueName: \"kubernetes.io/projected/3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c-kube-api-access-549f4\") pod \"auto-csr-approver-29551714-s79ft\" (UID: \"3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c\") " pod="openshift-infra/auto-csr-approver-29551714-s79ft" Mar 10 00:34:00 crc kubenswrapper[4994]: I0310 00:34:00.450821 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551714-s79ft" Mar 10 00:34:01 crc kubenswrapper[4994]: I0310 00:34:01.252449 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551714-s79ft"] Mar 10 00:34:01 crc kubenswrapper[4994]: I0310 00:34:01.863366 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551714-s79ft" event={"ID":"3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c","Type":"ContainerStarted","Data":"844f968fc65a43a5fe0017d514fe8a3ca83ad89726d636eec777ca0d9738de21"} Mar 10 00:34:03 crc kubenswrapper[4994]: I0310 00:34:03.884856 4994 generic.go:334] "Generic (PLEG): container finished" podID="3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c" containerID="55ae1bc05b680756a0fab6fc454424e48677cf98e3af7624cd80e10e8ec94e10" exitCode=0 Mar 10 00:34:03 crc kubenswrapper[4994]: I0310 00:34:03.884973 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29551714-s79ft" event={"ID":"3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c","Type":"ContainerDied","Data":"55ae1bc05b680756a0fab6fc454424e48677cf98e3af7624cd80e10e8ec94e10"} Mar 10 00:34:03 crc kubenswrapper[4994]: I0310 00:34:03.888398 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"69942722-a3c1-459b-96d3-260e0813093b","Type":"ContainerStarted","Data":"3573b3eabc534c0ae6632ceb8b4c59c0de6af63257f8b399b8492eafdae4de23"} Mar 10 00:34:05 crc kubenswrapper[4994]: I0310 00:34:05.270715 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551714-s79ft" Mar 10 00:34:05 crc kubenswrapper[4994]: I0310 00:34:05.409542 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-549f4\" (UniqueName: \"kubernetes.io/projected/3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c-kube-api-access-549f4\") pod \"3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c\" (UID: \"3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c\") " Mar 10 00:34:05 crc kubenswrapper[4994]: I0310 00:34:05.421016 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c-kube-api-access-549f4" (OuterVolumeSpecName: "kube-api-access-549f4") pod "3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c" (UID: "3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c"). InnerVolumeSpecName "kube-api-access-549f4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:34:05 crc kubenswrapper[4994]: I0310 00:34:05.511768 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-549f4\" (UniqueName: \"kubernetes.io/projected/3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c-kube-api-access-549f4\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:05 crc kubenswrapper[4994]: I0310 00:34:05.909417 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551714-s79ft" event={"ID":"3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c","Type":"ContainerDied","Data":"844f968fc65a43a5fe0017d514fe8a3ca83ad89726d636eec777ca0d9738de21"} Mar 10 00:34:05 crc kubenswrapper[4994]: I0310 00:34:05.909481 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="844f968fc65a43a5fe0017d514fe8a3ca83ad89726d636eec777ca0d9738de21" Mar 10 00:34:05 crc kubenswrapper[4994]: I0310 00:34:05.909496 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551714-s79ft" Mar 10 00:34:06 crc kubenswrapper[4994]: I0310 00:34:06.347688 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551708-mcgcl"] Mar 10 00:34:06 crc kubenswrapper[4994]: I0310 00:34:06.357437 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551708-mcgcl"] Mar 10 00:34:06 crc kubenswrapper[4994]: I0310 00:34:06.566718 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79b6ae72-9c1a-4191-84af-d06b0155e244" path="/var/lib/kubelet/pods/79b6ae72-9c1a-4191-84af-d06b0155e244/volumes" Mar 10 00:34:07 crc kubenswrapper[4994]: I0310 00:34:07.494953 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-fzqv5"] Mar 10 00:34:07 crc kubenswrapper[4994]: E0310 00:34:07.495239 4994 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c" containerName="oc" Mar 10 00:34:07 crc kubenswrapper[4994]: I0310 00:34:07.495255 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c" containerName="oc" Mar 10 00:34:07 crc kubenswrapper[4994]: I0310 00:34:07.495385 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c" containerName="oc" Mar 10 00:34:07 crc kubenswrapper[4994]: I0310 00:34:07.495797 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-fzqv5" Mar 10 00:34:07 crc kubenswrapper[4994]: I0310 00:34:07.555804 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-fzqv5"] Mar 10 00:34:07 crc kubenswrapper[4994]: I0310 00:34:07.640290 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5njjn\" (UniqueName: \"kubernetes.io/projected/8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b-kube-api-access-5njjn\") pod \"default-snmp-webhook-6856cfb745-fzqv5\" (UID: \"8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-fzqv5" Mar 10 00:34:07 crc kubenswrapper[4994]: I0310 00:34:07.741349 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5njjn\" (UniqueName: \"kubernetes.io/projected/8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b-kube-api-access-5njjn\") pod \"default-snmp-webhook-6856cfb745-fzqv5\" (UID: \"8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-fzqv5" Mar 10 00:34:07 crc kubenswrapper[4994]: I0310 00:34:07.761674 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5njjn\" (UniqueName: \"kubernetes.io/projected/8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b-kube-api-access-5njjn\") pod 
\"default-snmp-webhook-6856cfb745-fzqv5\" (UID: \"8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-fzqv5" Mar 10 00:34:07 crc kubenswrapper[4994]: I0310 00:34:07.861486 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-fzqv5" Mar 10 00:34:08 crc kubenswrapper[4994]: I0310 00:34:08.110321 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-fzqv5"] Mar 10 00:34:08 crc kubenswrapper[4994]: W0310 00:34:08.126756 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e6cb6c2_b4dc_41ad_83dc_63de94ec3b6b.slice/crio-266f1d0d272a751363a358ab288f8d8806ffcd76fb0de433b496be8b9125c9b3 WatchSource:0}: Error finding container 266f1d0d272a751363a358ab288f8d8806ffcd76fb0de433b496be8b9125c9b3: Status 404 returned error can't find the container with id 266f1d0d272a751363a358ab288f8d8806ffcd76fb0de433b496be8b9125c9b3 Mar 10 00:34:08 crc kubenswrapper[4994]: I0310 00:34:08.931118 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-fzqv5" event={"ID":"8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b","Type":"ContainerStarted","Data":"266f1d0d272a751363a358ab288f8d8806ffcd76fb0de433b496be8b9125c9b3"} Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.874911 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.876718 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.882588 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-alertmanager-proxy-tls" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.882713 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-tls-assets-0" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.882750 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-djsrm" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.882782 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-generated" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.882788 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-web-config" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.886171 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-cluster-tls-config" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.892540 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.945088 4994 generic.go:334] "Generic (PLEG): container finished" podID="69942722-a3c1-459b-96d3-260e0813093b" containerID="3573b3eabc534c0ae6632ceb8b4c59c0de6af63257f8b399b8492eafdae4de23" exitCode=0 Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.945125 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"69942722-a3c1-459b-96d3-260e0813093b","Type":"ContainerDied","Data":"3573b3eabc534c0ae6632ceb8b4c59c0de6af63257f8b399b8492eafdae4de23"} Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.996623 4994 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bhsr\" (UniqueName: \"kubernetes.io/projected/bd991a1f-d471-40c4-919f-75400e047b5d-kube-api-access-9bhsr\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.996692 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.996712 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.996731 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.996771 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bd991a1f-d471-40c4-919f-75400e047b5d-config-out\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " 
pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.996793 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-367f0112-d331-46f6-9afc-7958b67f370c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-367f0112-d331-46f6-9afc-7958b67f370c\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.996811 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bd991a1f-d471-40c4-919f-75400e047b5d-tls-assets\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.996884 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-web-config\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.996915 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-config-volume\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.098646 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bhsr\" (UniqueName: \"kubernetes.io/projected/bd991a1f-d471-40c4-919f-75400e047b5d-kube-api-access-9bhsr\") pod \"alertmanager-default-0\" (UID: 
\"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.098709 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.098739 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.098769 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.098816 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bd991a1f-d471-40c4-919f-75400e047b5d-config-out\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.098847 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-367f0112-d331-46f6-9afc-7958b67f370c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-367f0112-d331-46f6-9afc-7958b67f370c\") 
pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.098886 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bd991a1f-d471-40c4-919f-75400e047b5d-tls-assets\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.098940 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-web-config\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.098970 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-config-volume\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: E0310 00:34:11.099045 4994 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 10 00:34:11 crc kubenswrapper[4994]: E0310 00:34:11.099148 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls podName:bd991a1f-d471-40c4-919f-75400e047b5d nodeName:}" failed. No retries permitted until 2026-03-10 00:34:11.599127054 +0000 UTC m=+1665.772833803 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "bd991a1f-d471-40c4-919f-75400e047b5d") : secret "default-alertmanager-proxy-tls" not found Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.104762 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bd991a1f-d471-40c4-919f-75400e047b5d-config-out\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.105042 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bd991a1f-d471-40c4-919f-75400e047b5d-tls-assets\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.105181 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-web-config\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.105414 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.108446 4994 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.108535 4994 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-367f0112-d331-46f6-9afc-7958b67f370c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-367f0112-d331-46f6-9afc-7958b67f370c\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/70776db84b8166f32938d777b455eaeb195500fe20856cd31d03188fb0ad0492/globalmount\"" pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.115108 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-config-volume\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.115970 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.123943 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bhsr\" (UniqueName: \"kubernetes.io/projected/bd991a1f-d471-40c4-919f-75400e047b5d-kube-api-access-9bhsr\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.134184 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-367f0112-d331-46f6-9afc-7958b67f370c\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-367f0112-d331-46f6-9afc-7958b67f370c\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.605097 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: E0310 00:34:11.606315 4994 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 10 00:34:11 crc kubenswrapper[4994]: E0310 00:34:11.606364 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls podName:bd991a1f-d471-40c4-919f-75400e047b5d nodeName:}" failed. No retries permitted until 2026-03-10 00:34:12.606349589 +0000 UTC m=+1666.780056338 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "bd991a1f-d471-40c4-919f-75400e047b5d") : secret "default-alertmanager-proxy-tls" not found Mar 10 00:34:12 crc kubenswrapper[4994]: I0310 00:34:12.619269 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:12 crc kubenswrapper[4994]: E0310 00:34:12.619672 4994 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 10 00:34:12 crc kubenswrapper[4994]: E0310 00:34:12.619780 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls podName:bd991a1f-d471-40c4-919f-75400e047b5d nodeName:}" failed. No retries permitted until 2026-03-10 00:34:14.619750203 +0000 UTC m=+1668.793456982 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "bd991a1f-d471-40c4-919f-75400e047b5d") : secret "default-alertmanager-proxy-tls" not found Mar 10 00:34:14 crc kubenswrapper[4994]: I0310 00:34:14.648499 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:14 crc kubenswrapper[4994]: I0310 00:34:14.655983 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:14 crc kubenswrapper[4994]: I0310 00:34:14.884988 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:15 crc kubenswrapper[4994]: I0310 00:34:15.866904 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 10 00:34:15 crc kubenswrapper[4994]: I0310 00:34:15.998102 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"bd991a1f-d471-40c4-919f-75400e047b5d","Type":"ContainerStarted","Data":"2ae45c0efca905bcdb1a1dbe21cf60c0378b7a9c5b331c7bee9fbabfa6bc01ce"} Mar 10 00:34:17 crc kubenswrapper[4994]: I0310 00:34:17.006072 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-fzqv5" event={"ID":"8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b","Type":"ContainerStarted","Data":"c5143e3e63c3da41e9465f29e6534abe1fb3747e2848cdb4618517be3fbdac84"} Mar 10 00:34:17 crc kubenswrapper[4994]: I0310 00:34:17.029704 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-6856cfb745-fzqv5" podStartSLOduration=2.350913972 podStartE2EDuration="10.029683746s" podCreationTimestamp="2026-03-10 00:34:07 +0000 UTC" firstStartedPulling="2026-03-10 00:34:08.128848409 +0000 UTC m=+1662.302555158" lastFinishedPulling="2026-03-10 00:34:15.807618183 +0000 UTC m=+1669.981324932" observedRunningTime="2026-03-10 00:34:17.027194572 +0000 UTC m=+1671.200901341" watchObservedRunningTime="2026-03-10 00:34:17.029683746 +0000 UTC m=+1671.203390495" Mar 10 00:34:18 crc kubenswrapper[4994]: I0310 00:34:18.016277 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"bd991a1f-d471-40c4-919f-75400e047b5d","Type":"ContainerStarted","Data":"fd3d6b6bb77479ee7e5d2a48a2c1037c6305df2cbfacc9b60b82d0f19e9117de"} Mar 10 00:34:21 crc kubenswrapper[4994]: I0310 00:34:21.038881 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/prometheus-default-0" event={"ID":"69942722-a3c1-459b-96d3-260e0813093b","Type":"ContainerStarted","Data":"4c8dcbd81dbc3165ff83337d1eee59d8f9fba78c944471291cc50be2ea945974"} Mar 10 00:34:22 crc kubenswrapper[4994]: I0310 00:34:22.949800 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl"] Mar 10 00:34:22 crc kubenswrapper[4994]: I0310 00:34:22.960491 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:22 crc kubenswrapper[4994]: I0310 00:34:22.962593 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-xrmnc" Mar 10 00:34:22 crc kubenswrapper[4994]: I0310 00:34:22.963558 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-meter-sg-core-configmap" Mar 10 00:34:22 crc kubenswrapper[4994]: I0310 00:34:22.963731 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-coll-meter-proxy-tls" Mar 10 00:34:22 crc kubenswrapper[4994]: I0310 00:34:22.964009 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-session-secret" Mar 10 00:34:22 crc kubenswrapper[4994]: I0310 00:34:22.993207 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl"] Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.058533 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"69942722-a3c1-459b-96d3-260e0813093b","Type":"ContainerStarted","Data":"7cfe95d3958f7e00f3e16968e8dbc37f4b37a628b58fbc719b73147dc3fa4d39"} Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.080301 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/ba63838d-f012-4322-afa9-d46cb2387ae8-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.080522 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/ba63838d-f012-4322-afa9-d46cb2387ae8-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.080693 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.080827 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lnn6\" (UniqueName: \"kubernetes.io/projected/ba63838d-f012-4322-afa9-d46cb2387ae8-kube-api-access-6lnn6\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.080882 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"session-secret\" (UniqueName: \"kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.182704 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lnn6\" (UniqueName: \"kubernetes.io/projected/ba63838d-f012-4322-afa9-d46cb2387ae8-kube-api-access-6lnn6\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.182757 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.182786 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/ba63838d-f012-4322-afa9-d46cb2387ae8-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.182857 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/ba63838d-f012-4322-afa9-d46cb2387ae8-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: 
\"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.182929 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: E0310 00:34:23.183075 4994 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 10 00:34:23 crc kubenswrapper[4994]: E0310 00:34:23.183150 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-default-cloud1-coll-meter-proxy-tls podName:ba63838d-f012-4322-afa9-d46cb2387ae8 nodeName:}" failed. No retries permitted until 2026-03-10 00:34:23.683129858 +0000 UTC m=+1677.856836617 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" (UID: "ba63838d-f012-4322-afa9-d46cb2387ae8") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.183405 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/ba63838d-f012-4322-afa9-d46cb2387ae8-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.183902 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/ba63838d-f012-4322-afa9-d46cb2387ae8-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.198087 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.199660 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lnn6\" (UniqueName: \"kubernetes.io/projected/ba63838d-f012-4322-afa9-d46cb2387ae8-kube-api-access-6lnn6\") pod 
\"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.688265 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: E0310 00:34:23.688402 4994 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 10 00:34:23 crc kubenswrapper[4994]: E0310 00:34:23.688450 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-default-cloud1-coll-meter-proxy-tls podName:ba63838d-f012-4322-afa9-d46cb2387ae8 nodeName:}" failed. No retries permitted until 2026-03-10 00:34:24.688435374 +0000 UTC m=+1678.862142123 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" (UID: "ba63838d-f012-4322-afa9-d46cb2387ae8") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 10 00:34:24 crc kubenswrapper[4994]: I0310 00:34:24.701534 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:24 crc kubenswrapper[4994]: I0310 00:34:24.705944 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:24 crc kubenswrapper[4994]: I0310 00:34:24.788331 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.014424 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn"] Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.017055 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.020249 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-meter-sg-core-configmap" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.020506 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-ceil-meter-proxy-tls" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.031277 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn"] Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.083520 4994 generic.go:334] "Generic (PLEG): container finished" podID="bd991a1f-d471-40c4-919f-75400e047b5d" containerID="fd3d6b6bb77479ee7e5d2a48a2c1037c6305df2cbfacc9b60b82d0f19e9117de" exitCode=0 Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.083558 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"bd991a1f-d471-40c4-919f-75400e047b5d","Type":"ContainerDied","Data":"fd3d6b6bb77479ee7e5d2a48a2c1037c6305df2cbfacc9b60b82d0f19e9117de"} Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.120523 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6rvl\" (UniqueName: \"kubernetes.io/projected/71658d15-ee94-436a-8266-e6ef3680d0f0-kube-api-access-s6rvl\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.120633 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.120680 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.120724 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/71658d15-ee94-436a-8266-e6ef3680d0f0-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.120744 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/71658d15-ee94-436a-8266-e6ef3680d0f0-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.221689 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-default-cloud1-ceil-meter-proxy-tls\") pod 
\"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.221835 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.221890 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/71658d15-ee94-436a-8266-e6ef3680d0f0-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.221934 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/71658d15-ee94-436a-8266-e6ef3680d0f0-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.221968 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6rvl\" (UniqueName: \"kubernetes.io/projected/71658d15-ee94-436a-8266-e6ef3680d0f0-kube-api-access-s6rvl\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" 
Mar 10 00:34:26 crc kubenswrapper[4994]: E0310 00:34:26.222594 4994 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 10 00:34:26 crc kubenswrapper[4994]: E0310 00:34:26.222655 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-default-cloud1-ceil-meter-proxy-tls podName:71658d15-ee94-436a-8266-e6ef3680d0f0 nodeName:}" failed. No retries permitted until 2026-03-10 00:34:26.722636491 +0000 UTC m=+1680.896343350 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" (UID: "71658d15-ee94-436a-8266-e6ef3680d0f0") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.223087 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/71658d15-ee94-436a-8266-e6ef3680d0f0-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.223139 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/71658d15-ee94-436a-8266-e6ef3680d0f0-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.242547 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"session-secret\" (UniqueName: \"kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.249653 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6rvl\" (UniqueName: \"kubernetes.io/projected/71658d15-ee94-436a-8266-e6ef3680d0f0-kube-api-access-s6rvl\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.729525 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: E0310 00:34:26.729674 4994 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 10 00:34:26 crc kubenswrapper[4994]: E0310 00:34:26.729934 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-default-cloud1-ceil-meter-proxy-tls podName:71658d15-ee94-436a-8266-e6ef3680d0f0 nodeName:}" failed. No retries permitted until 2026-03-10 00:34:27.729917409 +0000 UTC m=+1681.903624158 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" (UID: "71658d15-ee94-436a-8266-e6ef3680d0f0") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 10 00:34:27 crc kubenswrapper[4994]: I0310 00:34:27.745772 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:27 crc kubenswrapper[4994]: I0310 00:34:27.762781 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:27 crc kubenswrapper[4994]: I0310 00:34:27.833588 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.009105 4994 scope.go:117] "RemoveContainer" containerID="76e4652e3cbbbc7dd950c967558f6551927fe404fc62cca88c145579e5829da9" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.234364 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl"] Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.237029 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.239905 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-sens-meter-proxy-tls" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.240309 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-sens-meter-sg-core-configmap" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.243389 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl"] Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.269261 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzss5\" (UniqueName: \"kubernetes.io/projected/eb6db775-6577-4ad6-90db-1fffd09b924b-kube-api-access-lzss5\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.269339 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: 
\"kubernetes.io/configmap/eb6db775-6577-4ad6-90db-1fffd09b924b-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.269373 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.269512 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.269566 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb6db775-6577-4ad6-90db-1fffd09b924b-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.372810 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-session-secret\") pod 
\"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.372885 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb6db775-6577-4ad6-90db-1fffd09b924b-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.372919 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzss5\" (UniqueName: \"kubernetes.io/projected/eb6db775-6577-4ad6-90db-1fffd09b924b-kube-api-access-lzss5\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.372953 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/eb6db775-6577-4ad6-90db-1fffd09b924b-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.372979 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " 
pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: E0310 00:34:29.373108 4994 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Mar 10 00:34:29 crc kubenswrapper[4994]: E0310 00:34:29.373160 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-default-cloud1-sens-meter-proxy-tls podName:eb6db775-6577-4ad6-90db-1fffd09b924b nodeName:}" failed. No retries permitted until 2026-03-10 00:34:29.873142247 +0000 UTC m=+1684.046848996 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" (UID: "eb6db775-6577-4ad6-90db-1fffd09b924b") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.377476 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.377739 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb6db775-6577-4ad6-90db-1fffd09b924b-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.378494 4994 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/eb6db775-6577-4ad6-90db-1fffd09b924b-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.393506 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzss5\" (UniqueName: \"kubernetes.io/projected/eb6db775-6577-4ad6-90db-1fffd09b924b-kube-api-access-lzss5\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.878127 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: E0310 00:34:29.878378 4994 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Mar 10 00:34:29 crc kubenswrapper[4994]: E0310 00:34:29.878486 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-default-cloud1-sens-meter-proxy-tls podName:eb6db775-6577-4ad6-90db-1fffd09b924b nodeName:}" failed. No retries permitted until 2026-03-10 00:34:30.878456263 +0000 UTC m=+1685.052163042 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" (UID: "eb6db775-6577-4ad6-90db-1fffd09b924b") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 10 00:34:30 crc kubenswrapper[4994]: I0310 00:34:30.897722 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:30 crc kubenswrapper[4994]: I0310 00:34:30.914358 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:31 crc kubenswrapper[4994]: I0310 00:34:31.056015 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:33 crc kubenswrapper[4994]: I0310 00:34:33.061420 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl"] Mar 10 00:34:33 crc kubenswrapper[4994]: W0310 00:34:33.069255 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb6db775_6577_4ad6_90db_1fffd09b924b.slice/crio-25642a0d6375eeeb023db7e18c0e5dcfe4b7d2646edd565df60dde9ae75c570d WatchSource:0}: Error finding container 25642a0d6375eeeb023db7e18c0e5dcfe4b7d2646edd565df60dde9ae75c570d: Status 404 returned error can't find the container with id 25642a0d6375eeeb023db7e18c0e5dcfe4b7d2646edd565df60dde9ae75c570d Mar 10 00:34:33 crc kubenswrapper[4994]: I0310 00:34:33.113203 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl"] Mar 10 00:34:33 crc kubenswrapper[4994]: I0310 00:34:33.125253 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn"] Mar 10 00:34:33 crc kubenswrapper[4994]: W0310 00:34:33.140841 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71658d15_ee94_436a_8266_e6ef3680d0f0.slice/crio-4970eb399bb870387b2d70df83ed093674b56d4d7f7d107ec1d6c7d1465cf6d6 WatchSource:0}: Error finding container 4970eb399bb870387b2d70df83ed093674b56d4d7f7d107ec1d6c7d1465cf6d6: Status 404 returned error can't find the container with id 4970eb399bb870387b2d70df83ed093674b56d4d7f7d107ec1d6c7d1465cf6d6 Mar 10 00:34:33 crc kubenswrapper[4994]: I0310 00:34:33.149518 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" 
event={"ID":"69942722-a3c1-459b-96d3-260e0813093b","Type":"ContainerStarted","Data":"cc76e7036a4a1e7a0ec7bfa5f38f0377cd47c300fdae995408476050e36061f4"} Mar 10 00:34:33 crc kubenswrapper[4994]: I0310 00:34:33.152627 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" event={"ID":"eb6db775-6577-4ad6-90db-1fffd09b924b","Type":"ContainerStarted","Data":"25642a0d6375eeeb023db7e18c0e5dcfe4b7d2646edd565df60dde9ae75c570d"} Mar 10 00:34:33 crc kubenswrapper[4994]: I0310 00:34:33.172357 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=3.37955234 podStartE2EDuration="37.172334315s" podCreationTimestamp="2026-03-10 00:33:56 +0000 UTC" firstStartedPulling="2026-03-10 00:33:58.892226944 +0000 UTC m=+1653.065933693" lastFinishedPulling="2026-03-10 00:34:32.685008919 +0000 UTC m=+1686.858715668" observedRunningTime="2026-03-10 00:34:33.168156538 +0000 UTC m=+1687.341863307" watchObservedRunningTime="2026-03-10 00:34:33.172334315 +0000 UTC m=+1687.346041084" Mar 10 00:34:33 crc kubenswrapper[4994]: I0310 00:34:33.660120 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/prometheus-default-0" Mar 10 00:34:34 crc kubenswrapper[4994]: I0310 00:34:34.159805 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" event={"ID":"ba63838d-f012-4322-afa9-d46cb2387ae8","Type":"ContainerStarted","Data":"b5ec25fcda84df8c51f85ac77064fcfb723d1c6b168a72edeed43f24a89f1f4f"} Mar 10 00:34:34 crc kubenswrapper[4994]: I0310 00:34:34.161404 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" 
event={"ID":"71658d15-ee94-436a-8266-e6ef3680d0f0","Type":"ContainerStarted","Data":"4970eb399bb870387b2d70df83ed093674b56d4d7f7d107ec1d6c7d1465cf6d6"} Mar 10 00:34:37 crc kubenswrapper[4994]: I0310 00:34:37.225698 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" event={"ID":"ba63838d-f012-4322-afa9-d46cb2387ae8","Type":"ContainerStarted","Data":"2ffe9e746d1f5e3dc40721ca6b65cac0335f05a6b67509907e9c69f144f050aa"} Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.239769 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" event={"ID":"71658d15-ee94-436a-8266-e6ef3680d0f0","Type":"ContainerStarted","Data":"53f62f7c35abd11f85b8e6021296d4d751874456f6783b02d540640a361fb444"} Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.240153 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" event={"ID":"71658d15-ee94-436a-8266-e6ef3680d0f0","Type":"ContainerStarted","Data":"4d5e4ee3ba96e2dd90d73b0be0ac50c0ec3a639d8e271a43ada0f3e3ebd65d64"} Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.241948 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" event={"ID":"eb6db775-6577-4ad6-90db-1fffd09b924b","Type":"ContainerStarted","Data":"f03960de7e0eee16beff7087a2835cba9b98d37f674a392e4a72e7b02877ae4c"} Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.244430 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"bd991a1f-d471-40c4-919f-75400e047b5d","Type":"ContainerStarted","Data":"a013a4322541af7a724ffa0f8a8b9913fc8612cfdff1a246b94f3311c0720c6f"} Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.406729 4994 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k"] Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.408573 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.412292 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-event-sg-core-configmap" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.412520 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-cert" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.419722 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k"] Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.520201 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/945b9546-8486-448b-a7b8-ec76634ff030-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.520280 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws2sp\" (UniqueName: \"kubernetes.io/projected/945b9546-8486-448b-a7b8-ec76634ff030-kube-api-access-ws2sp\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.520318 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"elastic-certs\" (UniqueName: \"kubernetes.io/secret/945b9546-8486-448b-a7b8-ec76634ff030-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.520341 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/945b9546-8486-448b-a7b8-ec76634ff030-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.623146 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/945b9546-8486-448b-a7b8-ec76634ff030-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.623205 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/945b9546-8486-448b-a7b8-ec76634ff030-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.623824 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/945b9546-8486-448b-a7b8-ec76634ff030-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " 
pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.623924 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/945b9546-8486-448b-a7b8-ec76634ff030-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.624709 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/945b9546-8486-448b-a7b8-ec76634ff030-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.624855 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws2sp\" (UniqueName: \"kubernetes.io/projected/945b9546-8486-448b-a7b8-ec76634ff030-kube-api-access-ws2sp\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.750371 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/945b9546-8486-448b-a7b8-ec76634ff030-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.750380 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-ws2sp\" (UniqueName: \"kubernetes.io/projected/945b9546-8486-448b-a7b8-ec76634ff030-kube-api-access-ws2sp\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.029087 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.270320 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" event={"ID":"ba63838d-f012-4322-afa9-d46cb2387ae8","Type":"ContainerStarted","Data":"e6068625af9c2045ef46c0df0c506de4bbc03d0c2d17fa5809ae88cbfbacc695"} Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.273845 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" event={"ID":"eb6db775-6577-4ad6-90db-1fffd09b924b","Type":"ContainerStarted","Data":"8b70a93015d9609790745f53b72f74b1cd43d650efee4dc67311f62ad75287ab"} Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.293327 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"bd991a1f-d471-40c4-919f-75400e047b5d","Type":"ContainerStarted","Data":"082a50713611f09a394395ce180245b195a8de5faa73b5061d20daaf3a5a4410"} Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.450468 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k"] Mar 10 00:34:39 crc kubenswrapper[4994]: W0310 00:34:39.484522 4994 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod945b9546_8486_448b_a7b8_ec76634ff030.slice/crio-c5eb670c43ca30ca91bc7d92b6a0fa1a3f43f65a23770ef94e5f5e0b400cf662 WatchSource:0}: Error finding container c5eb670c43ca30ca91bc7d92b6a0fa1a3f43f65a23770ef94e5f5e0b400cf662: Status 404 returned error can't find the container with id c5eb670c43ca30ca91bc7d92b6a0fa1a3f43f65a23770ef94e5f5e0b400cf662 Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.624833 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6"] Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.625792 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.630772 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-event-sg-core-configmap" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.640345 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6"] Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.741628 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c50e95d0-ef8f-4355-ba85-156de14a4408-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.741687 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/c50e95d0-ef8f-4355-ba85-156de14a4408-elastic-certs\") pod 
\"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.741713 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c50e95d0-ef8f-4355-ba85-156de14a4408-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.741773 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chb9t\" (UniqueName: \"kubernetes.io/projected/c50e95d0-ef8f-4355-ba85-156de14a4408-kube-api-access-chb9t\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.842538 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chb9t\" (UniqueName: \"kubernetes.io/projected/c50e95d0-ef8f-4355-ba85-156de14a4408-kube-api-access-chb9t\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.842602 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c50e95d0-ef8f-4355-ba85-156de14a4408-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " 
pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.842634 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/c50e95d0-ef8f-4355-ba85-156de14a4408-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.842662 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c50e95d0-ef8f-4355-ba85-156de14a4408-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.843162 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c50e95d0-ef8f-4355-ba85-156de14a4408-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.843618 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c50e95d0-ef8f-4355-ba85-156de14a4408-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.852555 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"elastic-certs\" (UniqueName: \"kubernetes.io/secret/c50e95d0-ef8f-4355-ba85-156de14a4408-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.858144 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chb9t\" (UniqueName: \"kubernetes.io/projected/c50e95d0-ef8f-4355-ba85-156de14a4408-kube-api-access-chb9t\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.964354 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:40 crc kubenswrapper[4994]: I0310 00:34:40.303094 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" event={"ID":"945b9546-8486-448b-a7b8-ec76634ff030","Type":"ContainerStarted","Data":"c84531a4e93d62654ebb196f19936778850438953ec8673d6fafbb0edd8785a6"} Mar 10 00:34:40 crc kubenswrapper[4994]: I0310 00:34:40.303373 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" event={"ID":"945b9546-8486-448b-a7b8-ec76634ff030","Type":"ContainerStarted","Data":"c5eb670c43ca30ca91bc7d92b6a0fa1a3f43f65a23770ef94e5f5e0b400cf662"} Mar 10 00:34:40 crc kubenswrapper[4994]: I0310 00:34:40.308288 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"bd991a1f-d471-40c4-919f-75400e047b5d","Type":"ContainerStarted","Data":"03523a8a2f91cfcd77bb522b38e12cb89e802728e9024e8ff4d7271b4af233b2"} Mar 10 
00:34:40 crc kubenswrapper[4994]: I0310 00:34:40.329241 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=18.009505292 podStartE2EDuration="31.329172167s" podCreationTimestamp="2026-03-10 00:34:09 +0000 UTC" firstStartedPulling="2026-03-10 00:34:26.086944871 +0000 UTC m=+1680.260651620" lastFinishedPulling="2026-03-10 00:34:39.406611746 +0000 UTC m=+1693.580318495" observedRunningTime="2026-03-10 00:34:40.325155954 +0000 UTC m=+1694.498862703" watchObservedRunningTime="2026-03-10 00:34:40.329172167 +0000 UTC m=+1694.502878916" Mar 10 00:34:43 crc kubenswrapper[4994]: I0310 00:34:43.660184 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0" Mar 10 00:34:43 crc kubenswrapper[4994]: I0310 00:34:43.712865 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0" Mar 10 00:34:44 crc kubenswrapper[4994]: I0310 00:34:44.423601 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0" Mar 10 00:34:48 crc kubenswrapper[4994]: I0310 00:34:48.892544 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:34:48 crc kubenswrapper[4994]: I0310 00:34:48.898223 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:34:49 crc kubenswrapper[4994]: W0310 00:34:49.792350 4994 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc50e95d0_ef8f_4355_ba85_156de14a4408.slice/crio-1a403c668e4a687bbc082011ed98891ea63192f1942f116c4f98c07a98b9f683 WatchSource:0}: Error finding container 1a403c668e4a687bbc082011ed98891ea63192f1942f116c4f98c07a98b9f683: Status 404 returned error can't find the container with id 1a403c668e4a687bbc082011ed98891ea63192f1942f116c4f98c07a98b9f683 Mar 10 00:34:49 crc kubenswrapper[4994]: I0310 00:34:49.792389 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6"] Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.415200 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" event={"ID":"ba63838d-f012-4322-afa9-d46cb2387ae8","Type":"ContainerStarted","Data":"fe370f35fd4a308131e3c1f4033bcafe043e97cb9c38019ee3088a1c73df5b11"} Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.417705 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" event={"ID":"71658d15-ee94-436a-8266-e6ef3680d0f0","Type":"ContainerStarted","Data":"013632f070d3c54dbd8158d7b24414c731cd77d54e78eb1de5beab4bf18b85d1"} Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.420093 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" event={"ID":"945b9546-8486-448b-a7b8-ec76634ff030","Type":"ContainerStarted","Data":"390f403d95c0494514ef6579e3128b59e525fce1f8ee3b60486bcfe8ee30637b"} Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.422625 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" 
event={"ID":"eb6db775-6577-4ad6-90db-1fffd09b924b","Type":"ContainerStarted","Data":"3c67a501c42f43ed2776494dabfbacee8dfff7375a998da6a3e29927d0ddb447"} Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.424581 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" event={"ID":"c50e95d0-ef8f-4355-ba85-156de14a4408","Type":"ContainerStarted","Data":"5737cbac03bf4951a511b42d7dfd30e6838483f53c97da7a1ed71db32c21afa1"} Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.424617 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" event={"ID":"c50e95d0-ef8f-4355-ba85-156de14a4408","Type":"ContainerStarted","Data":"9acae11a5f98bb0e9dd9df4d9e9005d260dc85f7f3616aa89360f47e34d6bf50"} Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.424651 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" event={"ID":"c50e95d0-ef8f-4355-ba85-156de14a4408","Type":"ContainerStarted","Data":"1a403c668e4a687bbc082011ed98891ea63192f1942f116c4f98c07a98b9f683"} Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.436715 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" podStartSLOduration=12.138287996 podStartE2EDuration="28.436692842s" podCreationTimestamp="2026-03-10 00:34:22 +0000 UTC" firstStartedPulling="2026-03-10 00:34:33.139929476 +0000 UTC m=+1687.313636225" lastFinishedPulling="2026-03-10 00:34:49.438334322 +0000 UTC m=+1703.612041071" observedRunningTime="2026-03-10 00:34:50.431378036 +0000 UTC m=+1704.605084805" watchObservedRunningTime="2026-03-10 00:34:50.436692842 +0000 UTC m=+1704.610399601" Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.478680 4994 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" podStartSLOduration=11.136160743 podStartE2EDuration="11.478658675s" podCreationTimestamp="2026-03-10 00:34:39 +0000 UTC" firstStartedPulling="2026-03-10 00:34:49.795002106 +0000 UTC m=+1703.968708855" lastFinishedPulling="2026-03-10 00:34:50.137500038 +0000 UTC m=+1704.311206787" observedRunningTime="2026-03-10 00:34:50.477360971 +0000 UTC m=+1704.651067720" watchObservedRunningTime="2026-03-10 00:34:50.478658675 +0000 UTC m=+1704.652365424" Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.482967 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" podStartSLOduration=2.544697501 podStartE2EDuration="12.482959655s" podCreationTimestamp="2026-03-10 00:34:38 +0000 UTC" firstStartedPulling="2026-03-10 00:34:39.500041507 +0000 UTC m=+1693.673748266" lastFinishedPulling="2026-03-10 00:34:49.438303661 +0000 UTC m=+1703.612010420" observedRunningTime="2026-03-10 00:34:50.458710395 +0000 UTC m=+1704.632417154" watchObservedRunningTime="2026-03-10 00:34:50.482959655 +0000 UTC m=+1704.656666404" Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.501016 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" podStartSLOduration=9.169570077 podStartE2EDuration="25.500995397s" podCreationTimestamp="2026-03-10 00:34:25 +0000 UTC" firstStartedPulling="2026-03-10 00:34:33.148566767 +0000 UTC m=+1687.322273516" lastFinishedPulling="2026-03-10 00:34:49.479992087 +0000 UTC m=+1703.653698836" observedRunningTime="2026-03-10 00:34:50.495504676 +0000 UTC m=+1704.669211425" watchObservedRunningTime="2026-03-10 00:34:50.500995397 +0000 UTC m=+1704.674702136" Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.525672 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" podStartSLOduration=5.196874735 podStartE2EDuration="21.525655228s" podCreationTimestamp="2026-03-10 00:34:29 +0000 UTC" firstStartedPulling="2026-03-10 00:34:33.072396758 +0000 UTC m=+1687.246103507" lastFinishedPulling="2026-03-10 00:34:49.401177251 +0000 UTC m=+1703.574884000" observedRunningTime="2026-03-10 00:34:50.525523633 +0000 UTC m=+1704.699230422" watchObservedRunningTime="2026-03-10 00:34:50.525655228 +0000 UTC m=+1704.699361977" Mar 10 00:34:56 crc kubenswrapper[4994]: I0310 00:34:56.240159 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-kmjj2"] Mar 10 00:34:56 crc kubenswrapper[4994]: I0310 00:34:56.240730 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" podUID="bd538cc5-49ab-4de4-b202-9068ffe969df" containerName="default-interconnect" containerID="cri-o://b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186" gracePeriod=30 Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.117762 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.219768 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-config\") pod \"bd538cc5-49ab-4de4-b202-9068ffe969df\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.219897 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-users\") pod \"bd538cc5-49ab-4de4-b202-9068ffe969df\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.219919 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-ca\") pod \"bd538cc5-49ab-4de4-b202-9068ffe969df\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.219942 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-credentials\") pod \"bd538cc5-49ab-4de4-b202-9068ffe969df\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.219969 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-ca\") pod \"bd538cc5-49ab-4de4-b202-9068ffe969df\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " Mar 10 00:34:57 crc 
kubenswrapper[4994]: I0310 00:34:57.219993 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-credentials\") pod \"bd538cc5-49ab-4de4-b202-9068ffe969df\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.220731 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rdzr\" (UniqueName: \"kubernetes.io/projected/bd538cc5-49ab-4de4-b202-9068ffe969df-kube-api-access-5rdzr\") pod \"bd538cc5-49ab-4de4-b202-9068ffe969df\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.220986 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "bd538cc5-49ab-4de4-b202-9068ffe969df" (UID: "bd538cc5-49ab-4de4-b202-9068ffe969df"). InnerVolumeSpecName "sasl-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.225284 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "bd538cc5-49ab-4de4-b202-9068ffe969df" (UID: "bd538cc5-49ab-4de4-b202-9068ffe969df"). InnerVolumeSpecName "default-interconnect-inter-router-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.226009 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "bd538cc5-49ab-4de4-b202-9068ffe969df" (UID: "bd538cc5-49ab-4de4-b202-9068ffe969df"). InnerVolumeSpecName "default-interconnect-openstack-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.242483 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "bd538cc5-49ab-4de4-b202-9068ffe969df" (UID: "bd538cc5-49ab-4de4-b202-9068ffe969df"). InnerVolumeSpecName "default-interconnect-openstack-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.242587 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "bd538cc5-49ab-4de4-b202-9068ffe969df" (UID: "bd538cc5-49ab-4de4-b202-9068ffe969df"). InnerVolumeSpecName "sasl-users". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.242640 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "bd538cc5-49ab-4de4-b202-9068ffe969df" (UID: "bd538cc5-49ab-4de4-b202-9068ffe969df"). InnerVolumeSpecName "default-interconnect-inter-router-credentials". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.243074 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd538cc5-49ab-4de4-b202-9068ffe969df-kube-api-access-5rdzr" (OuterVolumeSpecName: "kube-api-access-5rdzr") pod "bd538cc5-49ab-4de4-b202-9068ffe969df" (UID: "bd538cc5-49ab-4de4-b202-9068ffe969df"). InnerVolumeSpecName "kube-api-access-5rdzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.322729 4994 reconciler_common.go:293] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.322767 4994 reconciler_common.go:293] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-users\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.322778 4994 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.322791 4994 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-credentials\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.322803 4994 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:57 crc 
kubenswrapper[4994]: I0310 00:34:57.322812 4994 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.322824 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rdzr\" (UniqueName: \"kubernetes.io/projected/bd538cc5-49ab-4de4-b202-9068ffe969df-kube-api-access-5rdzr\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.347041 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-2p86s"] Mar 10 00:34:57 crc kubenswrapper[4994]: E0310 00:34:57.347468 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd538cc5-49ab-4de4-b202-9068ffe969df" containerName="default-interconnect" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.347484 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd538cc5-49ab-4de4-b202-9068ffe969df" containerName="default-interconnect" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.347593 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd538cc5-49ab-4de4-b202-9068ffe969df" containerName="default-interconnect" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.348009 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.364149 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-2p86s"] Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.424660 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.424731 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-sasl-users\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.424755 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-sasl-config\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.424792 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: 
\"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.424832 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.425101 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.425233 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p4k9\" (UniqueName: \"kubernetes.io/projected/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-kube-api-access-7p4k9\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.468846 4994 generic.go:334] "Generic (PLEG): container finished" podID="c50e95d0-ef8f-4355-ba85-156de14a4408" containerID="9acae11a5f98bb0e9dd9df4d9e9005d260dc85f7f3616aa89360f47e34d6bf50" exitCode=0 Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.468902 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" 
event={"ID":"c50e95d0-ef8f-4355-ba85-156de14a4408","Type":"ContainerDied","Data":"9acae11a5f98bb0e9dd9df4d9e9005d260dc85f7f3616aa89360f47e34d6bf50"} Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.469427 4994 scope.go:117] "RemoveContainer" containerID="9acae11a5f98bb0e9dd9df4d9e9005d260dc85f7f3616aa89360f47e34d6bf50" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.473378 4994 generic.go:334] "Generic (PLEG): container finished" podID="bd538cc5-49ab-4de4-b202-9068ffe969df" containerID="b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186" exitCode=0 Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.473442 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.473446 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" event={"ID":"bd538cc5-49ab-4de4-b202-9068ffe969df","Type":"ContainerDied","Data":"b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186"} Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.473578 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" event={"ID":"bd538cc5-49ab-4de4-b202-9068ffe969df","Type":"ContainerDied","Data":"a70d05716712507b24f298fa27a033f55a9aa604b32f2cbef81d39d3a28695ec"} Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.473625 4994 scope.go:117] "RemoveContainer" containerID="b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.476361 4994 generic.go:334] "Generic (PLEG): container finished" podID="ba63838d-f012-4322-afa9-d46cb2387ae8" containerID="e6068625af9c2045ef46c0df0c506de4bbc03d0c2d17fa5809ae88cbfbacc695" exitCode=0 Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.476420 4994 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" event={"ID":"ba63838d-f012-4322-afa9-d46cb2387ae8","Type":"ContainerDied","Data":"e6068625af9c2045ef46c0df0c506de4bbc03d0c2d17fa5809ae88cbfbacc695"} Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.476930 4994 scope.go:117] "RemoveContainer" containerID="e6068625af9c2045ef46c0df0c506de4bbc03d0c2d17fa5809ae88cbfbacc695" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.493527 4994 generic.go:334] "Generic (PLEG): container finished" podID="71658d15-ee94-436a-8266-e6ef3680d0f0" containerID="4d5e4ee3ba96e2dd90d73b0be0ac50c0ec3a639d8e271a43ada0f3e3ebd65d64" exitCode=0 Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.493582 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" event={"ID":"71658d15-ee94-436a-8266-e6ef3680d0f0","Type":"ContainerDied","Data":"4d5e4ee3ba96e2dd90d73b0be0ac50c0ec3a639d8e271a43ada0f3e3ebd65d64"} Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.495071 4994 scope.go:117] "RemoveContainer" containerID="4d5e4ee3ba96e2dd90d73b0be0ac50c0ec3a639d8e271a43ada0f3e3ebd65d64" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.495127 4994 generic.go:334] "Generic (PLEG): container finished" podID="945b9546-8486-448b-a7b8-ec76634ff030" containerID="c84531a4e93d62654ebb196f19936778850438953ec8673d6fafbb0edd8785a6" exitCode=0 Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.495175 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" event={"ID":"945b9546-8486-448b-a7b8-ec76634ff030","Type":"ContainerDied","Data":"c84531a4e93d62654ebb196f19936778850438953ec8673d6fafbb0edd8785a6"} Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.495604 4994 scope.go:117] "RemoveContainer" containerID="c84531a4e93d62654ebb196f19936778850438953ec8673d6fafbb0edd8785a6" Mar 10 00:34:57 crc 
kubenswrapper[4994]: I0310 00:34:57.505251 4994 scope.go:117] "RemoveContainer" containerID="b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.509421 4994 generic.go:334] "Generic (PLEG): container finished" podID="eb6db775-6577-4ad6-90db-1fffd09b924b" containerID="8b70a93015d9609790745f53b72f74b1cd43d650efee4dc67311f62ad75287ab" exitCode=0 Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.509450 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" event={"ID":"eb6db775-6577-4ad6-90db-1fffd09b924b","Type":"ContainerDied","Data":"8b70a93015d9609790745f53b72f74b1cd43d650efee4dc67311f62ad75287ab"} Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.509780 4994 scope.go:117] "RemoveContainer" containerID="8b70a93015d9609790745f53b72f74b1cd43d650efee4dc67311f62ad75287ab" Mar 10 00:34:57 crc kubenswrapper[4994]: E0310 00:34:57.509982 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186\": container with ID starting with b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186 not found: ID does not exist" containerID="b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.510027 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186"} err="failed to get container status \"b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186\": rpc error: code = NotFound desc = could not find container \"b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186\": container with ID starting with b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186 not found: ID does not exist" 
Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.524730 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-kmjj2"] Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.527038 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p4k9\" (UniqueName: \"kubernetes.io/projected/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-kube-api-access-7p4k9\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.527120 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.527140 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-sasl-users\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.527154 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-sasl-config\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.528049 4994 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.530650 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-kmjj2"] Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.531102 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-sasl-config\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.531371 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.531761 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.536519 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.543659 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-sasl-users\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.548573 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.549030 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.561054 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: 
\"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.563813 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p4k9\" (UniqueName: \"kubernetes.io/projected/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-kube-api-access-7p4k9\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.669498 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.964465 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-2p86s"] Mar 10 00:34:58 crc kubenswrapper[4994]: I0310 00:34:58.521565 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" event={"ID":"c50e95d0-ef8f-4355-ba85-156de14a4408","Type":"ContainerStarted","Data":"6d58b9675a43308db681361c0bef9b74160a0c0119ed01f9c003d6f031166ed0"} Mar 10 00:34:58 crc kubenswrapper[4994]: I0310 00:34:58.529411 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" event={"ID":"ba63838d-f012-4322-afa9-d46cb2387ae8","Type":"ContainerStarted","Data":"169ddcb6a7b00ea999678151450f0db053d745d229a33e2dd89700bc08cf9670"} Mar 10 00:34:58 crc kubenswrapper[4994]: I0310 00:34:58.537428 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" event={"ID":"71658d15-ee94-436a-8266-e6ef3680d0f0","Type":"ContainerStarted","Data":"fdf59e41141258a1b8472c2d3611bcc044fc9d05558c10266949e1896f729f90"} Mar 10 00:34:58 crc kubenswrapper[4994]: 
I0310 00:34:58.543566 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-2p86s" event={"ID":"732a0e77-64b0-4fd5-a2b8-05cb2f80de80","Type":"ContainerStarted","Data":"de2b49af68890bf8c0ee072ca9bdd53673b16a063f8cf78488b2543c00a5e5ed"} Mar 10 00:34:58 crc kubenswrapper[4994]: I0310 00:34:58.543598 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-2p86s" event={"ID":"732a0e77-64b0-4fd5-a2b8-05cb2f80de80","Type":"ContainerStarted","Data":"426ab4b74ec294b316f67ef3bb8f570755b1df95ac09ff80248990a8b46ed3ea"} Mar 10 00:34:58 crc kubenswrapper[4994]: I0310 00:34:58.548578 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" event={"ID":"945b9546-8486-448b-a7b8-ec76634ff030","Type":"ContainerStarted","Data":"aa86ab0a88d44c9621b517e017ee873893234cfa280d568b1f4abc59862b5cb2"} Mar 10 00:34:58 crc kubenswrapper[4994]: I0310 00:34:58.552021 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" event={"ID":"eb6db775-6577-4ad6-90db-1fffd09b924b","Type":"ContainerStarted","Data":"f7d20e00b2e0ce7b14482f30354361fd2e1ca73ef829a0328bc5831de70cc576"} Mar 10 00:34:58 crc kubenswrapper[4994]: I0310 00:34:58.564513 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd538cc5-49ab-4de4-b202-9068ffe969df" path="/var/lib/kubelet/pods/bd538cc5-49ab-4de4-b202-9068ffe969df/volumes" Mar 10 00:34:58 crc kubenswrapper[4994]: I0310 00:34:58.612672 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-2p86s" podStartSLOduration=2.612644263 podStartE2EDuration="2.612644263s" podCreationTimestamp="2026-03-10 00:34:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-10 00:34:58.591479052 +0000 UTC m=+1712.765185811" watchObservedRunningTime="2026-03-10 00:34:58.612644263 +0000 UTC m=+1712.786351022" Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.570656 4994 generic.go:334] "Generic (PLEG): container finished" podID="71658d15-ee94-436a-8266-e6ef3680d0f0" containerID="fdf59e41141258a1b8472c2d3611bcc044fc9d05558c10266949e1896f729f90" exitCode=0 Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.571055 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" event={"ID":"71658d15-ee94-436a-8266-e6ef3680d0f0","Type":"ContainerDied","Data":"fdf59e41141258a1b8472c2d3611bcc044fc9d05558c10266949e1896f729f90"} Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.571100 4994 scope.go:117] "RemoveContainer" containerID="4d5e4ee3ba96e2dd90d73b0be0ac50c0ec3a639d8e271a43ada0f3e3ebd65d64" Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.571693 4994 scope.go:117] "RemoveContainer" containerID="fdf59e41141258a1b8472c2d3611bcc044fc9d05558c10266949e1896f729f90" Mar 10 00:34:59 crc kubenswrapper[4994]: E0310 00:34:59.571959 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn_service-telemetry(71658d15-ee94-436a-8266-e6ef3680d0f0)\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" podUID="71658d15-ee94-436a-8266-e6ef3680d0f0" Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.577208 4994 generic.go:334] "Generic (PLEG): container finished" podID="945b9546-8486-448b-a7b8-ec76634ff030" containerID="aa86ab0a88d44c9621b517e017ee873893234cfa280d568b1f4abc59862b5cb2" exitCode=0 Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.577281 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" event={"ID":"945b9546-8486-448b-a7b8-ec76634ff030","Type":"ContainerDied","Data":"aa86ab0a88d44c9621b517e017ee873893234cfa280d568b1f4abc59862b5cb2"} Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.577822 4994 scope.go:117] "RemoveContainer" containerID="aa86ab0a88d44c9621b517e017ee873893234cfa280d568b1f4abc59862b5cb2" Mar 10 00:34:59 crc kubenswrapper[4994]: E0310 00:34:59.578076 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-b85695595-kd89k_service-telemetry(945b9546-8486-448b-a7b8-ec76634ff030)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" podUID="945b9546-8486-448b-a7b8-ec76634ff030" Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.582308 4994 generic.go:334] "Generic (PLEG): container finished" podID="eb6db775-6577-4ad6-90db-1fffd09b924b" containerID="f7d20e00b2e0ce7b14482f30354361fd2e1ca73ef829a0328bc5831de70cc576" exitCode=0 Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.582363 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" event={"ID":"eb6db775-6577-4ad6-90db-1fffd09b924b","Type":"ContainerDied","Data":"f7d20e00b2e0ce7b14482f30354361fd2e1ca73ef829a0328bc5831de70cc576"} Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.582791 4994 scope.go:117] "RemoveContainer" containerID="f7d20e00b2e0ce7b14482f30354361fd2e1ca73ef829a0328bc5831de70cc576" Mar 10 00:34:59 crc kubenswrapper[4994]: E0310 00:34:59.583151 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge 
pod=default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl_service-telemetry(eb6db775-6577-4ad6-90db-1fffd09b924b)\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" podUID="eb6db775-6577-4ad6-90db-1fffd09b924b" Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.603093 4994 generic.go:334] "Generic (PLEG): container finished" podID="c50e95d0-ef8f-4355-ba85-156de14a4408" containerID="6d58b9675a43308db681361c0bef9b74160a0c0119ed01f9c003d6f031166ed0" exitCode=0 Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.603168 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" event={"ID":"c50e95d0-ef8f-4355-ba85-156de14a4408","Type":"ContainerDied","Data":"6d58b9675a43308db681361c0bef9b74160a0c0119ed01f9c003d6f031166ed0"} Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.603970 4994 scope.go:117] "RemoveContainer" containerID="6d58b9675a43308db681361c0bef9b74160a0c0119ed01f9c003d6f031166ed0" Mar 10 00:34:59 crc kubenswrapper[4994]: E0310 00:34:59.604288 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6_service-telemetry(c50e95d0-ef8f-4355-ba85-156de14a4408)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" podUID="c50e95d0-ef8f-4355-ba85-156de14a4408" Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.625423 4994 generic.go:334] "Generic (PLEG): container finished" podID="ba63838d-f012-4322-afa9-d46cb2387ae8" containerID="169ddcb6a7b00ea999678151450f0db053d745d229a33e2dd89700bc08cf9670" exitCode=0 Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.626101 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" 
event={"ID":"ba63838d-f012-4322-afa9-d46cb2387ae8","Type":"ContainerDied","Data":"169ddcb6a7b00ea999678151450f0db053d745d229a33e2dd89700bc08cf9670"} Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.626351 4994 scope.go:117] "RemoveContainer" containerID="169ddcb6a7b00ea999678151450f0db053d745d229a33e2dd89700bc08cf9670" Mar 10 00:34:59 crc kubenswrapper[4994]: E0310 00:34:59.626502 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl_service-telemetry(ba63838d-f012-4322-afa9-d46cb2387ae8)\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" podUID="ba63838d-f012-4322-afa9-d46cb2387ae8" Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.631353 4994 scope.go:117] "RemoveContainer" containerID="c84531a4e93d62654ebb196f19936778850438953ec8673d6fafbb0edd8785a6" Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.684464 4994 scope.go:117] "RemoveContainer" containerID="8b70a93015d9609790745f53b72f74b1cd43d650efee4dc67311f62ad75287ab" Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.736900 4994 scope.go:117] "RemoveContainer" containerID="9acae11a5f98bb0e9dd9df4d9e9005d260dc85f7f3616aa89360f47e34d6bf50" Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.766466 4994 scope.go:117] "RemoveContainer" containerID="e6068625af9c2045ef46c0df0c506de4bbc03d0c2d17fa5809ae88cbfbacc695" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.343520 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"] Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.344620 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/qdr-test" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.347113 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-selfsigned" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.348172 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"qdr-test-config" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.350549 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.387552 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlst6\" (UniqueName: \"kubernetes.io/projected/f445e0d9-75b9-47ec-b98b-9e881b7f1856-kube-api-access-vlst6\") pod \"qdr-test\" (UID: \"f445e0d9-75b9-47ec-b98b-9e881b7f1856\") " pod="service-telemetry/qdr-test" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.387604 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/f445e0d9-75b9-47ec-b98b-9e881b7f1856-qdr-test-config\") pod \"qdr-test\" (UID: \"f445e0d9-75b9-47ec-b98b-9e881b7f1856\") " pod="service-telemetry/qdr-test" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.387722 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/f445e0d9-75b9-47ec-b98b-9e881b7f1856-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"f445e0d9-75b9-47ec-b98b-9e881b7f1856\") " pod="service-telemetry/qdr-test" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.489330 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlst6\" (UniqueName: 
\"kubernetes.io/projected/f445e0d9-75b9-47ec-b98b-9e881b7f1856-kube-api-access-vlst6\") pod \"qdr-test\" (UID: \"f445e0d9-75b9-47ec-b98b-9e881b7f1856\") " pod="service-telemetry/qdr-test" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.489386 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/f445e0d9-75b9-47ec-b98b-9e881b7f1856-qdr-test-config\") pod \"qdr-test\" (UID: \"f445e0d9-75b9-47ec-b98b-9e881b7f1856\") " pod="service-telemetry/qdr-test" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.489464 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/f445e0d9-75b9-47ec-b98b-9e881b7f1856-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"f445e0d9-75b9-47ec-b98b-9e881b7f1856\") " pod="service-telemetry/qdr-test" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.490213 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/f445e0d9-75b9-47ec-b98b-9e881b7f1856-qdr-test-config\") pod \"qdr-test\" (UID: \"f445e0d9-75b9-47ec-b98b-9e881b7f1856\") " pod="service-telemetry/qdr-test" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.494866 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/f445e0d9-75b9-47ec-b98b-9e881b7f1856-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"f445e0d9-75b9-47ec-b98b-9e881b7f1856\") " pod="service-telemetry/qdr-test" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.504449 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlst6\" (UniqueName: \"kubernetes.io/projected/f445e0d9-75b9-47ec-b98b-9e881b7f1856-kube-api-access-vlst6\") pod \"qdr-test\" (UID: 
\"f445e0d9-75b9-47ec-b98b-9e881b7f1856\") " pod="service-telemetry/qdr-test" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.657116 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Mar 10 00:35:01 crc kubenswrapper[4994]: I0310 00:35:01.132566 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Mar 10 00:35:01 crc kubenswrapper[4994]: I0310 00:35:01.656398 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"f445e0d9-75b9-47ec-b98b-9e881b7f1856","Type":"ContainerStarted","Data":"b419da21cbd4c826d10d500abdec14efec0593b7527ab9954c764986b1cdd160"} Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.188622 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rzlg8"] Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.190570 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.217621 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rzlg8"] Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.312854 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-catalog-content\") pod \"redhat-operators-rzlg8\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") " pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.312947 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-utilities\") pod \"redhat-operators-rzlg8\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") " 
pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.312971 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvbb6\" (UniqueName: \"kubernetes.io/projected/6aff907a-203f-42b1-9ecb-20ab1860a00d-kube-api-access-pvbb6\") pod \"redhat-operators-rzlg8\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") " pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.413994 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-catalog-content\") pod \"redhat-operators-rzlg8\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") " pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.414314 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-utilities\") pod \"redhat-operators-rzlg8\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") " pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.414336 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvbb6\" (UniqueName: \"kubernetes.io/projected/6aff907a-203f-42b1-9ecb-20ab1860a00d-kube-api-access-pvbb6\") pod \"redhat-operators-rzlg8\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") " pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.414603 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-catalog-content\") pod \"redhat-operators-rzlg8\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") " 
pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.414834 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-utilities\") pod \"redhat-operators-rzlg8\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") " pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.434055 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvbb6\" (UniqueName: \"kubernetes.io/projected/6aff907a-203f-42b1-9ecb-20ab1860a00d-kube-api-access-pvbb6\") pod \"redhat-operators-rzlg8\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") " pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.512730 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:10 crc kubenswrapper[4994]: I0310 00:35:10.554507 4994 scope.go:117] "RemoveContainer" containerID="aa86ab0a88d44c9621b517e017ee873893234cfa280d568b1f4abc59862b5cb2" Mar 10 00:35:11 crc kubenswrapper[4994]: I0310 00:35:11.554369 4994 scope.go:117] "RemoveContainer" containerID="f7d20e00b2e0ce7b14482f30354361fd2e1ca73ef829a0328bc5831de70cc576" Mar 10 00:35:12 crc kubenswrapper[4994]: I0310 00:35:12.136184 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rzlg8"] Mar 10 00:35:12 crc kubenswrapper[4994]: W0310 00:35:12.154109 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6aff907a_203f_42b1_9ecb_20ab1860a00d.slice/crio-067eb1ca7a214122994392b44bb2fc025416f971156c0d5b5d1b76db28176dcb WatchSource:0}: Error finding container 067eb1ca7a214122994392b44bb2fc025416f971156c0d5b5d1b76db28176dcb: Status 404 returned error can't find the 
container with id 067eb1ca7a214122994392b44bb2fc025416f971156c0d5b5d1b76db28176dcb Mar 10 00:35:12 crc kubenswrapper[4994]: I0310 00:35:12.553685 4994 scope.go:117] "RemoveContainer" containerID="fdf59e41141258a1b8472c2d3611bcc044fc9d05558c10266949e1896f729f90" Mar 10 00:35:12 crc kubenswrapper[4994]: I0310 00:35:12.554255 4994 scope.go:117] "RemoveContainer" containerID="6d58b9675a43308db681361c0bef9b74160a0c0119ed01f9c003d6f031166ed0" Mar 10 00:35:12 crc kubenswrapper[4994]: I0310 00:35:12.744905 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" event={"ID":"945b9546-8486-448b-a7b8-ec76634ff030","Type":"ContainerStarted","Data":"399699caec1e522bdeb077a5d07c27a9eea106dc4d93a5ba9e97d0bbd5978ba1"} Mar 10 00:35:12 crc kubenswrapper[4994]: I0310 00:35:12.751910 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" event={"ID":"eb6db775-6577-4ad6-90db-1fffd09b924b","Type":"ContainerStarted","Data":"326aebfc04a1e52598b39914c5e7f78c2742d4b50b3bee1bf930ce58eadb9a8b"} Mar 10 00:35:12 crc kubenswrapper[4994]: I0310 00:35:12.753526 4994 generic.go:334] "Generic (PLEG): container finished" podID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerID="f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548" exitCode=0 Mar 10 00:35:12 crc kubenswrapper[4994]: I0310 00:35:12.753608 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzlg8" event={"ID":"6aff907a-203f-42b1-9ecb-20ab1860a00d","Type":"ContainerDied","Data":"f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548"} Mar 10 00:35:12 crc kubenswrapper[4994]: I0310 00:35:12.753631 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzlg8" 
event={"ID":"6aff907a-203f-42b1-9ecb-20ab1860a00d","Type":"ContainerStarted","Data":"067eb1ca7a214122994392b44bb2fc025416f971156c0d5b5d1b76db28176dcb"} Mar 10 00:35:12 crc kubenswrapper[4994]: I0310 00:35:12.758004 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"f445e0d9-75b9-47ec-b98b-9e881b7f1856","Type":"ContainerStarted","Data":"5884f0e424445fcf01622c118482f0c4fc472cf12808b855b40b5afcbe8d1cb3"} Mar 10 00:35:12 crc kubenswrapper[4994]: I0310 00:35:12.816842 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/qdr-test" podStartSLOduration=2.170389696 podStartE2EDuration="12.816805365s" podCreationTimestamp="2026-03-10 00:35:00 +0000 UTC" firstStartedPulling="2026-03-10 00:35:01.140942311 +0000 UTC m=+1715.314649060" lastFinishedPulling="2026-03-10 00:35:11.78735797 +0000 UTC m=+1725.961064729" observedRunningTime="2026-03-10 00:35:12.81661551 +0000 UTC m=+1726.990322269" watchObservedRunningTime="2026-03-10 00:35:12.816805365 +0000 UTC m=+1726.990512104" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.110115 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-j8jlg"] Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.111331 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.114436 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.114556 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.114683 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.114741 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.114799 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.114923 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.132665 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-j8jlg"] Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.200339 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.200442 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-sensubility-config\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.200490 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-publisher\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.200556 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-config\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.200657 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-healthcheck-log\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.200763 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45wc7\" (UniqueName: \"kubernetes.io/projected/6b5caccf-955c-4075-b043-7f1bea611f1e-kube-api-access-45wc7\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.200863 
4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.302758 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-publisher\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.303369 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-config\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.303391 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-healthcheck-log\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.303412 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45wc7\" (UniqueName: \"kubernetes.io/projected/6b5caccf-955c-4075-b043-7f1bea611f1e-kube-api-access-45wc7\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc 
kubenswrapper[4994]: I0310 00:35:13.303438 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.303494 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.303533 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-sensubility-config\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.304828 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-sensubility-config\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.305565 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-publisher\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " 
pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.306240 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-config\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.306832 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-healthcheck-log\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.307985 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.308616 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.361543 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45wc7\" (UniqueName: \"kubernetes.io/projected/6b5caccf-955c-4075-b043-7f1bea611f1e-kube-api-access-45wc7\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " 
pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.411506 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"] Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.412908 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.419076 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.428153 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.507067 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-262z7\" (UniqueName: \"kubernetes.io/projected/71067204-bbe4-4fcc-9fb0-6a349363479a-kube-api-access-262z7\") pod \"curl\" (UID: \"71067204-bbe4-4fcc-9fb0-6a349363479a\") " pod="service-telemetry/curl" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.553973 4994 scope.go:117] "RemoveContainer" containerID="169ddcb6a7b00ea999678151450f0db053d745d229a33e2dd89700bc08cf9670" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.608307 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-262z7\" (UniqueName: \"kubernetes.io/projected/71067204-bbe4-4fcc-9fb0-6a349363479a-kube-api-access-262z7\") pod \"curl\" (UID: \"71067204-bbe4-4fcc-9fb0-6a349363479a\") " pod="service-telemetry/curl" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.630612 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-262z7\" (UniqueName: \"kubernetes.io/projected/71067204-bbe4-4fcc-9fb0-6a349363479a-kube-api-access-262z7\") pod \"curl\" (UID: \"71067204-bbe4-4fcc-9fb0-6a349363479a\") " pod="service-telemetry/curl" Mar 10 00:35:13 
crc kubenswrapper[4994]: I0310 00:35:13.735175 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-j8jlg"] Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.739472 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.782719 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" event={"ID":"71658d15-ee94-436a-8266-e6ef3680d0f0","Type":"ContainerStarted","Data":"9c04e4299610c8d0304f27bd7be11cb95958733395d98aa4af3fca9a79d6c6f7"} Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.793994 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" event={"ID":"c50e95d0-ef8f-4355-ba85-156de14a4408","Type":"ContainerStarted","Data":"66ec1d9568ec573d582b0b8dbddb66e09c796d92044c36ab0c4f1ee9b447d358"} Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.795901 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-j8jlg" event={"ID":"6b5caccf-955c-4075-b043-7f1bea611f1e","Type":"ContainerStarted","Data":"2d98b06a95f6259f9b2d744815dd5dc10490cb14f9c60bcb23cba61992b0701f"} Mar 10 00:35:14 crc kubenswrapper[4994]: I0310 00:35:14.193569 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Mar 10 00:35:14 crc kubenswrapper[4994]: I0310 00:35:14.818897 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" event={"ID":"ba63838d-f012-4322-afa9-d46cb2387ae8","Type":"ContainerStarted","Data":"20fe6fe24015dd6c61a3c23e21dfce6af7eff722f2267f212e0d75bd1acfc6e3"} Mar 10 00:35:14 crc kubenswrapper[4994]: I0310 00:35:14.822623 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" 
event={"ID":"71067204-bbe4-4fcc-9fb0-6a349363479a","Type":"ContainerStarted","Data":"3b29bec2ca18d264db1f11e8f6eb788920224123906afbb2881a52bc48dd332f"} Mar 10 00:35:14 crc kubenswrapper[4994]: I0310 00:35:14.826327 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzlg8" event={"ID":"6aff907a-203f-42b1-9ecb-20ab1860a00d","Type":"ContainerStarted","Data":"7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb"} Mar 10 00:35:16 crc kubenswrapper[4994]: I0310 00:35:16.845699 4994 generic.go:334] "Generic (PLEG): container finished" podID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerID="7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb" exitCode=0 Mar 10 00:35:16 crc kubenswrapper[4994]: I0310 00:35:16.845784 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzlg8" event={"ID":"6aff907a-203f-42b1-9ecb-20ab1860a00d","Type":"ContainerDied","Data":"7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb"} Mar 10 00:35:18 crc kubenswrapper[4994]: I0310 00:35:18.892812 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:35:18 crc kubenswrapper[4994]: I0310 00:35:18.893226 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:35:25 crc kubenswrapper[4994]: I0310 00:35:25.916578 4994 generic.go:334] "Generic (PLEG): container finished" podID="71067204-bbe4-4fcc-9fb0-6a349363479a" 
containerID="6d8fe1647417184e8e25c69d6debae961f810ea3d0dc03dd5ec0460f1f287477" exitCode=0 Mar 10 00:35:25 crc kubenswrapper[4994]: I0310 00:35:25.916698 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"71067204-bbe4-4fcc-9fb0-6a349363479a","Type":"ContainerDied","Data":"6d8fe1647417184e8e25c69d6debae961f810ea3d0dc03dd5ec0460f1f287477"} Mar 10 00:35:25 crc kubenswrapper[4994]: I0310 00:35:25.918948 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzlg8" event={"ID":"6aff907a-203f-42b1-9ecb-20ab1860a00d","Type":"ContainerStarted","Data":"963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d"} Mar 10 00:35:25 crc kubenswrapper[4994]: I0310 00:35:25.920843 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-j8jlg" event={"ID":"6b5caccf-955c-4075-b043-7f1bea611f1e","Type":"ContainerStarted","Data":"d9ce53eb5bc559e2a48e954e260519cec3def3db573dfb9cad96b1f81692f09f"} Mar 10 00:35:25 crc kubenswrapper[4994]: I0310 00:35:25.960797 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rzlg8" podStartSLOduration=4.437664607 podStartE2EDuration="16.960777846s" podCreationTimestamp="2026-03-10 00:35:09 +0000 UTC" firstStartedPulling="2026-03-10 00:35:12.756002209 +0000 UTC m=+1726.929708958" lastFinishedPulling="2026-03-10 00:35:25.279115408 +0000 UTC m=+1739.452822197" observedRunningTime="2026-03-10 00:35:25.954472944 +0000 UTC m=+1740.128179693" watchObservedRunningTime="2026-03-10 00:35:25.960777846 +0000 UTC m=+1740.134484615" Mar 10 00:35:27 crc kubenswrapper[4994]: I0310 00:35:27.176131 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Mar 10 00:35:27 crc kubenswrapper[4994]: I0310 00:35:27.307703 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-262z7\" (UniqueName: \"kubernetes.io/projected/71067204-bbe4-4fcc-9fb0-6a349363479a-kube-api-access-262z7\") pod \"71067204-bbe4-4fcc-9fb0-6a349363479a\" (UID: \"71067204-bbe4-4fcc-9fb0-6a349363479a\") " Mar 10 00:35:27 crc kubenswrapper[4994]: I0310 00:35:27.312725 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_71067204-bbe4-4fcc-9fb0-6a349363479a/curl/0.log" Mar 10 00:35:27 crc kubenswrapper[4994]: I0310 00:35:27.336711 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71067204-bbe4-4fcc-9fb0-6a349363479a-kube-api-access-262z7" (OuterVolumeSpecName: "kube-api-access-262z7") pod "71067204-bbe4-4fcc-9fb0-6a349363479a" (UID: "71067204-bbe4-4fcc-9fb0-6a349363479a"). InnerVolumeSpecName "kube-api-access-262z7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:35:27 crc kubenswrapper[4994]: I0310 00:35:27.409403 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-262z7\" (UniqueName: \"kubernetes.io/projected/71067204-bbe4-4fcc-9fb0-6a349363479a-kube-api-access-262z7\") on node \"crc\" DevicePath \"\"" Mar 10 00:35:27 crc kubenswrapper[4994]: I0310 00:35:27.575959 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-fzqv5_8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b/prometheus-webhook-snmp/0.log" Mar 10 00:35:27 crc kubenswrapper[4994]: I0310 00:35:27.933554 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"71067204-bbe4-4fcc-9fb0-6a349363479a","Type":"ContainerDied","Data":"3b29bec2ca18d264db1f11e8f6eb788920224123906afbb2881a52bc48dd332f"} Mar 10 00:35:27 crc kubenswrapper[4994]: I0310 00:35:27.933590 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b29bec2ca18d264db1f11e8f6eb788920224123906afbb2881a52bc48dd332f" Mar 10 00:35:27 crc kubenswrapper[4994]: I0310 00:35:27.933649 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl"
Mar 10 00:35:29 crc kubenswrapper[4994]: I0310 00:35:29.513232 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rzlg8"
Mar 10 00:35:29 crc kubenswrapper[4994]: I0310 00:35:29.513502 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rzlg8"
Mar 10 00:35:30 crc kubenswrapper[4994]: I0310 00:35:30.551602 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rzlg8" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="registry-server" probeResult="failure" output=<
Mar 10 00:35:30 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s
Mar 10 00:35:30 crc kubenswrapper[4994]: >
Mar 10 00:35:32 crc kubenswrapper[4994]: I0310 00:35:32.977241 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-j8jlg" event={"ID":"6b5caccf-955c-4075-b043-7f1bea611f1e","Type":"ContainerStarted","Data":"17d0bcd92de3ad324c7f89758949a37e1f93ee189f044b9ef739243d90393ed8"}
Mar 10 00:35:33 crc kubenswrapper[4994]: I0310 00:35:33.013809 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/stf-smoketest-smoke1-j8jlg" podStartSLOduration=1.539929115 podStartE2EDuration="20.013782521s" podCreationTimestamp="2026-03-10 00:35:13 +0000 UTC" firstStartedPulling="2026-03-10 00:35:13.75400552 +0000 UTC m=+1727.927712269" lastFinishedPulling="2026-03-10 00:35:32.227858926 +0000 UTC m=+1746.401565675" observedRunningTime="2026-03-10 00:35:33.011100852 +0000 UTC m=+1747.184807621" watchObservedRunningTime="2026-03-10 00:35:33.013782521 +0000 UTC m=+1747.187489310"
Mar 10 00:35:40 crc kubenswrapper[4994]: I0310 00:35:40.579277 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rzlg8" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="registry-server" probeResult="failure" output=<
Mar 10 00:35:40 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s
Mar 10 00:35:40 crc kubenswrapper[4994]: >
Mar 10 00:35:48 crc kubenswrapper[4994]: I0310 00:35:48.893959 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 00:35:48 crc kubenswrapper[4994]: I0310 00:35:48.894672 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 00:35:48 crc kubenswrapper[4994]: I0310 00:35:48.894747 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kfljj"
Mar 10 00:35:48 crc kubenswrapper[4994]: I0310 00:35:48.897505 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b"} pod="openshift-machine-config-operator/machine-config-daemon-kfljj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 00:35:48 crc kubenswrapper[4994]: I0310 00:35:48.897645 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" containerID="cri-o://39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" gracePeriod=600
Mar 10 00:35:49 crc kubenswrapper[4994]: E0310 00:35:49.533386 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace"
Mar 10 00:35:50 crc kubenswrapper[4994]: I0310 00:35:50.122011 4994 generic.go:334] "Generic (PLEG): container finished" podID="ced5d66d-39df-4267-b801-e1e60d517ace" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" exitCode=0
Mar 10 00:35:50 crc kubenswrapper[4994]: I0310 00:35:50.122047 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerDied","Data":"39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b"}
Mar 10 00:35:50 crc kubenswrapper[4994]: I0310 00:35:50.122075 4994 scope.go:117] "RemoveContainer" containerID="d8d935625d60ec1fe79acd428aa0c427cb2a184ba8e0a37f25ea8bb9485e5629"
Mar 10 00:35:50 crc kubenswrapper[4994]: I0310 00:35:50.122534 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b"
Mar 10 00:35:50 crc kubenswrapper[4994]: E0310 00:35:50.122741 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace"
Mar 10 00:35:50 crc kubenswrapper[4994]: I0310 00:35:50.581277 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rzlg8" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="registry-server" probeResult="failure" output=<
Mar 10 00:35:50 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s
Mar 10 00:35:50 crc kubenswrapper[4994]: >
Mar 10 00:35:57 crc kubenswrapper[4994]: I0310 00:35:57.703821 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-fzqv5_8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b/prometheus-webhook-snmp/0.log"
Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.149064 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551716-7pfgn"]
Mar 10 00:36:00 crc kubenswrapper[4994]: E0310 00:36:00.150334 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71067204-bbe4-4fcc-9fb0-6a349363479a" containerName="curl"
Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.150437 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="71067204-bbe4-4fcc-9fb0-6a349363479a" containerName="curl"
Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.150655 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="71067204-bbe4-4fcc-9fb0-6a349363479a" containerName="curl"
Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.151311 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551716-7pfgn"
Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.153565 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl"
Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.153956 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.154191 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.185146 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551716-7pfgn"]
Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.217391 4994 generic.go:334] "Generic (PLEG): container finished" podID="6b5caccf-955c-4075-b043-7f1bea611f1e" containerID="d9ce53eb5bc559e2a48e954e260519cec3def3db573dfb9cad96b1f81692f09f" exitCode=0
Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.217479 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-j8jlg" event={"ID":"6b5caccf-955c-4075-b043-7f1bea611f1e","Type":"ContainerDied","Data":"d9ce53eb5bc559e2a48e954e260519cec3def3db573dfb9cad96b1f81692f09f"}
Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.218511 4994 scope.go:117] "RemoveContainer" containerID="d9ce53eb5bc559e2a48e954e260519cec3def3db573dfb9cad96b1f81692f09f"
Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.288783 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sqnl\" (UniqueName: \"kubernetes.io/projected/de54238c-3c16-4557-aaa0-fb321dc61ca7-kube-api-access-9sqnl\") pod \"auto-csr-approver-29551716-7pfgn\" (UID: \"de54238c-3c16-4557-aaa0-fb321dc61ca7\") " pod="openshift-infra/auto-csr-approver-29551716-7pfgn"
Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.391704 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sqnl\" (UniqueName: \"kubernetes.io/projected/de54238c-3c16-4557-aaa0-fb321dc61ca7-kube-api-access-9sqnl\") pod \"auto-csr-approver-29551716-7pfgn\" (UID: \"de54238c-3c16-4557-aaa0-fb321dc61ca7\") " pod="openshift-infra/auto-csr-approver-29551716-7pfgn"
Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.434206 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sqnl\" (UniqueName: \"kubernetes.io/projected/de54238c-3c16-4557-aaa0-fb321dc61ca7-kube-api-access-9sqnl\") pod \"auto-csr-approver-29551716-7pfgn\" (UID: \"de54238c-3c16-4557-aaa0-fb321dc61ca7\") " pod="openshift-infra/auto-csr-approver-29551716-7pfgn"
Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.476171 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551716-7pfgn"
Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.569948 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rzlg8" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="registry-server" probeResult="failure" output=<
Mar 10 00:36:00 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s
Mar 10 00:36:00 crc kubenswrapper[4994]: >
Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.712028 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551716-7pfgn"]
Mar 10 00:36:01 crc kubenswrapper[4994]: I0310 00:36:01.237750 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551716-7pfgn" event={"ID":"de54238c-3c16-4557-aaa0-fb321dc61ca7","Type":"ContainerStarted","Data":"286423bcf633f2c2ee39746fd2fc31ea8582414925a75e045afa246fd899accb"}
Mar 10 00:36:01 crc kubenswrapper[4994]: I0310 00:36:01.554264 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b"
Mar 10 00:36:01 crc kubenswrapper[4994]: E0310 00:36:01.554633 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace"
Mar 10 00:36:02 crc kubenswrapper[4994]: I0310 00:36:02.246626 4994 generic.go:334] "Generic (PLEG): container finished" podID="de54238c-3c16-4557-aaa0-fb321dc61ca7" containerID="c123e634ccea8da5ca365371ebfd215620629bd57b4fb38743d94a62179c0683" exitCode=0
Mar 10 00:36:02 crc kubenswrapper[4994]: I0310 00:36:02.246718 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551716-7pfgn" event={"ID":"de54238c-3c16-4557-aaa0-fb321dc61ca7","Type":"ContainerDied","Data":"c123e634ccea8da5ca365371ebfd215620629bd57b4fb38743d94a62179c0683"}
Mar 10 00:36:03 crc kubenswrapper[4994]: I0310 00:36:03.556955 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551716-7pfgn"
Mar 10 00:36:03 crc kubenswrapper[4994]: I0310 00:36:03.652102 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sqnl\" (UniqueName: \"kubernetes.io/projected/de54238c-3c16-4557-aaa0-fb321dc61ca7-kube-api-access-9sqnl\") pod \"de54238c-3c16-4557-aaa0-fb321dc61ca7\" (UID: \"de54238c-3c16-4557-aaa0-fb321dc61ca7\") "
Mar 10 00:36:03 crc kubenswrapper[4994]: I0310 00:36:03.662672 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de54238c-3c16-4557-aaa0-fb321dc61ca7-kube-api-access-9sqnl" (OuterVolumeSpecName: "kube-api-access-9sqnl") pod "de54238c-3c16-4557-aaa0-fb321dc61ca7" (UID: "de54238c-3c16-4557-aaa0-fb321dc61ca7"). InnerVolumeSpecName "kube-api-access-9sqnl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:36:03 crc kubenswrapper[4994]: I0310 00:36:03.754933 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sqnl\" (UniqueName: \"kubernetes.io/projected/de54238c-3c16-4557-aaa0-fb321dc61ca7-kube-api-access-9sqnl\") on node \"crc\" DevicePath \"\""
Mar 10 00:36:04 crc kubenswrapper[4994]: I0310 00:36:04.267450 4994 generic.go:334] "Generic (PLEG): container finished" podID="6b5caccf-955c-4075-b043-7f1bea611f1e" containerID="17d0bcd92de3ad324c7f89758949a37e1f93ee189f044b9ef739243d90393ed8" exitCode=0
Mar 10 00:36:04 crc kubenswrapper[4994]: I0310 00:36:04.267579 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-j8jlg" event={"ID":"6b5caccf-955c-4075-b043-7f1bea611f1e","Type":"ContainerDied","Data":"17d0bcd92de3ad324c7f89758949a37e1f93ee189f044b9ef739243d90393ed8"}
Mar 10 00:36:04 crc kubenswrapper[4994]: I0310 00:36:04.272206 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551716-7pfgn" event={"ID":"de54238c-3c16-4557-aaa0-fb321dc61ca7","Type":"ContainerDied","Data":"286423bcf633f2c2ee39746fd2fc31ea8582414925a75e045afa246fd899accb"}
Mar 10 00:36:04 crc kubenswrapper[4994]: I0310 00:36:04.272267 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="286423bcf633f2c2ee39746fd2fc31ea8582414925a75e045afa246fd899accb"
Mar 10 00:36:04 crc kubenswrapper[4994]: I0310 00:36:04.272267 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551716-7pfgn"
Mar 10 00:36:04 crc kubenswrapper[4994]: I0310 00:36:04.638414 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551710-vrwfw"]
Mar 10 00:36:04 crc kubenswrapper[4994]: I0310 00:36:04.646106 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551710-vrwfw"]
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.577486 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-j8jlg"
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.682834 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-entrypoint-script\") pod \"6b5caccf-955c-4075-b043-7f1bea611f1e\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") "
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.682904 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-sensubility-config\") pod \"6b5caccf-955c-4075-b043-7f1bea611f1e\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") "
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.682932 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45wc7\" (UniqueName: \"kubernetes.io/projected/6b5caccf-955c-4075-b043-7f1bea611f1e-kube-api-access-45wc7\") pod \"6b5caccf-955c-4075-b043-7f1bea611f1e\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") "
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.682960 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-entrypoint-script\") pod \"6b5caccf-955c-4075-b043-7f1bea611f1e\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") "
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.682976 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-healthcheck-log\") pod \"6b5caccf-955c-4075-b043-7f1bea611f1e\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") "
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.682994 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-config\") pod \"6b5caccf-955c-4075-b043-7f1bea611f1e\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") "
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.683087 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-publisher\") pod \"6b5caccf-955c-4075-b043-7f1bea611f1e\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") "
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.704739 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "6b5caccf-955c-4075-b043-7f1bea611f1e" (UID: "6b5caccf-955c-4075-b043-7f1bea611f1e"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.705530 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "6b5caccf-955c-4075-b043-7f1bea611f1e" (UID: "6b5caccf-955c-4075-b043-7f1bea611f1e"). InnerVolumeSpecName "ceilometer-publisher". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.705662 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b5caccf-955c-4075-b043-7f1bea611f1e-kube-api-access-45wc7" (OuterVolumeSpecName: "kube-api-access-45wc7") pod "6b5caccf-955c-4075-b043-7f1bea611f1e" (UID: "6b5caccf-955c-4075-b043-7f1bea611f1e"). InnerVolumeSpecName "kube-api-access-45wc7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.710190 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "6b5caccf-955c-4075-b043-7f1bea611f1e" (UID: "6b5caccf-955c-4075-b043-7f1bea611f1e"). InnerVolumeSpecName "sensubility-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.712189 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "6b5caccf-955c-4075-b043-7f1bea611f1e" (UID: "6b5caccf-955c-4075-b043-7f1bea611f1e"). InnerVolumeSpecName "healthcheck-log". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.721046 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "6b5caccf-955c-4075-b043-7f1bea611f1e" (UID: "6b5caccf-955c-4075-b043-7f1bea611f1e"). InnerVolumeSpecName "collectd-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.726964 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "6b5caccf-955c-4075-b043-7f1bea611f1e" (UID: "6b5caccf-955c-4075-b043-7f1bea611f1e"). InnerVolumeSpecName "ceilometer-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.784421 4994 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-publisher\") on node \"crc\" DevicePath \"\""
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.784460 4994 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\""
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.784471 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45wc7\" (UniqueName: \"kubernetes.io/projected/6b5caccf-955c-4075-b043-7f1bea611f1e-kube-api-access-45wc7\") on node \"crc\" DevicePath \"\""
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.784483 4994 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-sensubility-config\") on node \"crc\" DevicePath \"\""
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.784495 4994 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\""
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.784504 4994 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-healthcheck-log\") on node \"crc\" DevicePath \"\""
Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.784512 4994 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-config\") on node \"crc\" DevicePath \"\""
Mar 10 00:36:06 crc kubenswrapper[4994]: I0310 00:36:06.289107 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-j8jlg" event={"ID":"6b5caccf-955c-4075-b043-7f1bea611f1e","Type":"ContainerDied","Data":"2d98b06a95f6259f9b2d744815dd5dc10490cb14f9c60bcb23cba61992b0701f"}
Mar 10 00:36:06 crc kubenswrapper[4994]: I0310 00:36:06.289463 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d98b06a95f6259f9b2d744815dd5dc10490cb14f9c60bcb23cba61992b0701f"
Mar 10 00:36:06 crc kubenswrapper[4994]: I0310 00:36:06.289168 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-j8jlg"
Mar 10 00:36:06 crc kubenswrapper[4994]: I0310 00:36:06.569024 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="907fae93-d4a7-46e8-9fab-3c964fcb52ab" path="/var/lib/kubelet/pods/907fae93-d4a7-46e8-9fab-3c964fcb52ab/volumes"
Mar 10 00:36:07 crc kubenswrapper[4994]: I0310 00:36:07.671687 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-j8jlg_6b5caccf-955c-4075-b043-7f1bea611f1e/smoketest-collectd/0.log"
Mar 10 00:36:07 crc kubenswrapper[4994]: I0310 00:36:07.980100 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-j8jlg_6b5caccf-955c-4075-b043-7f1bea611f1e/smoketest-ceilometer/0.log"
Mar 10 00:36:08 crc kubenswrapper[4994]: I0310 00:36:08.274754 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-2p86s_732a0e77-64b0-4fd5-a2b8-05cb2f80de80/default-interconnect/0.log"
Mar 10 00:36:08 crc kubenswrapper[4994]: I0310 00:36:08.552147 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl_ba63838d-f012-4322-afa9-d46cb2387ae8/bridge/2.log"
Mar 10 00:36:08 crc kubenswrapper[4994]: I0310 00:36:08.782241 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl_ba63838d-f012-4322-afa9-d46cb2387ae8/sg-core/0.log"
Mar 10 00:36:09 crc kubenswrapper[4994]: I0310 00:36:09.013798 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-b85695595-kd89k_945b9546-8486-448b-a7b8-ec76634ff030/bridge/2.log"
Mar 10 00:36:09 crc kubenswrapper[4994]: I0310 00:36:09.248775 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-b85695595-kd89k_945b9546-8486-448b-a7b8-ec76634ff030/sg-core/0.log"
Mar 10 00:36:09 crc kubenswrapper[4994]: I0310 00:36:09.470123 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn_71658d15-ee94-436a-8266-e6ef3680d0f0/bridge/2.log"
Mar 10 00:36:09 crc kubenswrapper[4994]: I0310 00:36:09.564992 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rzlg8"
Mar 10 00:36:09 crc kubenswrapper[4994]: I0310 00:36:09.637244 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rzlg8"
Mar 10 00:36:09 crc kubenswrapper[4994]: I0310 00:36:09.706140 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn_71658d15-ee94-436a-8266-e6ef3680d0f0/sg-core/0.log"
Mar 10 00:36:09 crc kubenswrapper[4994]: I0310 00:36:09.813636 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rzlg8"]
Mar 10 00:36:09 crc kubenswrapper[4994]: I0310 00:36:09.936766 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6_c50e95d0-ef8f-4355-ba85-156de14a4408/bridge/2.log"
Mar 10 00:36:10 crc kubenswrapper[4994]: I0310 00:36:10.166239 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6_c50e95d0-ef8f-4355-ba85-156de14a4408/sg-core/0.log"
Mar 10 00:36:10 crc kubenswrapper[4994]: I0310 00:36:10.372995 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl_eb6db775-6577-4ad6-90db-1fffd09b924b/bridge/2.log"
Mar 10 00:36:10 crc kubenswrapper[4994]: I0310 00:36:10.597030 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl_eb6db775-6577-4ad6-90db-1fffd09b924b/sg-core/0.log"
Mar 10 00:36:11 crc kubenswrapper[4994]: I0310 00:36:11.327534 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rzlg8" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="registry-server" containerID="cri-o://963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d" gracePeriod=2
Mar 10 00:36:11 crc kubenswrapper[4994]: I0310 00:36:11.746649 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rzlg8"
Mar 10 00:36:11 crc kubenswrapper[4994]: I0310 00:36:11.901960 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-utilities\") pod \"6aff907a-203f-42b1-9ecb-20ab1860a00d\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") "
Mar 10 00:36:11 crc kubenswrapper[4994]: I0310 00:36:11.902040 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvbb6\" (UniqueName: \"kubernetes.io/projected/6aff907a-203f-42b1-9ecb-20ab1860a00d-kube-api-access-pvbb6\") pod \"6aff907a-203f-42b1-9ecb-20ab1860a00d\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") "
Mar 10 00:36:11 crc kubenswrapper[4994]: I0310 00:36:11.902104 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-catalog-content\") pod \"6aff907a-203f-42b1-9ecb-20ab1860a00d\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") "
Mar 10 00:36:11 crc kubenswrapper[4994]: I0310 00:36:11.905540 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-utilities" (OuterVolumeSpecName: "utilities") pod "6aff907a-203f-42b1-9ecb-20ab1860a00d" (UID: "6aff907a-203f-42b1-9ecb-20ab1860a00d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:36:11 crc kubenswrapper[4994]: I0310 00:36:11.911059 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aff907a-203f-42b1-9ecb-20ab1860a00d-kube-api-access-pvbb6" (OuterVolumeSpecName: "kube-api-access-pvbb6") pod "6aff907a-203f-42b1-9ecb-20ab1860a00d" (UID: "6aff907a-203f-42b1-9ecb-20ab1860a00d"). InnerVolumeSpecName "kube-api-access-pvbb6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.003644 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.003907 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvbb6\" (UniqueName: \"kubernetes.io/projected/6aff907a-203f-42b1-9ecb-20ab1860a00d-kube-api-access-pvbb6\") on node \"crc\" DevicePath \"\""
Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.016290 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6aff907a-203f-42b1-9ecb-20ab1860a00d" (UID: "6aff907a-203f-42b1-9ecb-20ab1860a00d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.105644 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.364111 4994 generic.go:334] "Generic (PLEG): container finished" podID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerID="963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d" exitCode=0
Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.364158 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzlg8" event={"ID":"6aff907a-203f-42b1-9ecb-20ab1860a00d","Type":"ContainerDied","Data":"963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d"}
Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.364191 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzlg8" event={"ID":"6aff907a-203f-42b1-9ecb-20ab1860a00d","Type":"ContainerDied","Data":"067eb1ca7a214122994392b44bb2fc025416f971156c0d5b5d1b76db28176dcb"}
Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.364207 4994 scope.go:117] "RemoveContainer" containerID="963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d"
Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.364354 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rzlg8"
Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.391516 4994 scope.go:117] "RemoveContainer" containerID="7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb"
Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.419929 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rzlg8"]
Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.424917 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rzlg8"]
Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.443453 4994 scope.go:117] "RemoveContainer" containerID="f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548"
Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.459281 4994 scope.go:117] "RemoveContainer" containerID="963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d"
Mar 10 00:36:12 crc kubenswrapper[4994]: E0310 00:36:12.459590 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d\": container with ID starting with 963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d not found: ID does not exist" containerID="963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d"
Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.459617 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d"} err="failed to get container status \"963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d\": rpc error: code = NotFound desc = could not find container \"963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d\": container with ID starting with 963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d not found: ID does not exist"
Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.459637 4994 scope.go:117] "RemoveContainer" containerID="7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb"
Mar 10 00:36:12 crc kubenswrapper[4994]: E0310 00:36:12.460000 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb\": container with ID starting with 7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb not found: ID does not exist" containerID="7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb"
Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.460024 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb"} err="failed to get container status \"7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb\": rpc error: code = NotFound desc = could not find container \"7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb\": container with ID starting with 7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb not found: ID does not exist"
Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.460038 4994 scope.go:117] "RemoveContainer" containerID="f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548"
Mar 10 00:36:12 crc kubenswrapper[4994]: E0310 00:36:12.460269 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548\": container with ID starting with f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548 not found: ID does not exist" containerID="f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548"
Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.460291 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548"} err="failed to get container status \"f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548\": rpc error: code = NotFound desc = could not find container \"f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548\": container with ID starting with f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548 not found: ID does not exist"
Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.563428 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" path="/var/lib/kubelet/pods/6aff907a-203f-42b1-9ecb-20ab1860a00d/volumes"
Mar 10 00:36:13 crc kubenswrapper[4994]: I0310 00:36:13.825353 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-7457956966-kbwlx_0730e042-f632-4db2-a694-b5917982d77d/operator/0.log"
Mar 10 00:36:14 crc kubenswrapper[4994]: I0310 00:36:14.101942 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_69942722-a3c1-459b-96d3-260e0813093b/prometheus/0.log"
Mar 10 00:36:14 crc kubenswrapper[4994]: I0310 00:36:14.348370 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7/elasticsearch/0.log"
Mar 10 00:36:14 crc kubenswrapper[4994]: I0310 00:36:14.554514 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b"
Mar 10 00:36:14 crc kubenswrapper[4994]: E0310 00:36:14.554862 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace"
Mar 10 00:36:14 crc kubenswrapper[4994]: I0310 00:36:14.630116 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-fzqv5_8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b/prometheus-webhook-snmp/0.log"
Mar 10 00:36:14 crc kubenswrapper[4994]: I0310 00:36:14.871783 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_bd991a1f-d471-40c4-919f-75400e047b5d/alertmanager/0.log"
Mar 10 00:36:27 crc kubenswrapper[4994]: I0310 00:36:27.555579 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b"
Mar 10 00:36:27 crc kubenswrapper[4994]: E0310 00:36:27.556842 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace"
Mar 10 00:36:31 crc kubenswrapper[4994]: I0310 00:36:31.103582 4994 log.go:25] "Finished parsing log file"
path="/var/log/pods/service-telemetry_service-telemetry-operator-656df8f446-7rqn6_134b5ce4-37cf-459f-9c27-dafae8eb9e86/operator/0.log" Mar 10 00:36:32 crc kubenswrapper[4994]: I0310 00:36:32.641583 4994 scope.go:117] "RemoveContainer" containerID="dc93dce81f7d66a840274eb6f49e057db6ba425ffe2f1cac85085352655d2af7" Mar 10 00:36:34 crc kubenswrapper[4994]: I0310 00:36:34.456335 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-7457956966-kbwlx_0730e042-f632-4db2-a694-b5917982d77d/operator/0.log" Mar 10 00:36:34 crc kubenswrapper[4994]: I0310 00:36:34.780546 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_f445e0d9-75b9-47ec-b98b-9e881b7f1856/qdr/0.log" Mar 10 00:36:39 crc kubenswrapper[4994]: I0310 00:36:39.553976 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:36:39 crc kubenswrapper[4994]: E0310 00:36:39.555225 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:36:50 crc kubenswrapper[4994]: I0310 00:36:50.554498 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:36:50 crc kubenswrapper[4994]: E0310 00:36:50.555181 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.274696 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jxgzr/must-gather-78m7n"] Mar 10 00:36:58 crc kubenswrapper[4994]: E0310 00:36:58.275562 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b5caccf-955c-4075-b043-7f1bea611f1e" containerName="smoketest-collectd" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.275578 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b5caccf-955c-4075-b043-7f1bea611f1e" containerName="smoketest-collectd" Mar 10 00:36:58 crc kubenswrapper[4994]: E0310 00:36:58.275598 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="extract-content" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.275605 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="extract-content" Mar 10 00:36:58 crc kubenswrapper[4994]: E0310 00:36:58.275623 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="extract-utilities" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.275631 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="extract-utilities" Mar 10 00:36:58 crc kubenswrapper[4994]: E0310 00:36:58.275642 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b5caccf-955c-4075-b043-7f1bea611f1e" containerName="smoketest-ceilometer" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.275649 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b5caccf-955c-4075-b043-7f1bea611f1e" containerName="smoketest-ceilometer" Mar 10 00:36:58 crc kubenswrapper[4994]: E0310 00:36:58.275662 4994 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="registry-server" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.275670 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="registry-server" Mar 10 00:36:58 crc kubenswrapper[4994]: E0310 00:36:58.275689 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de54238c-3c16-4557-aaa0-fb321dc61ca7" containerName="oc" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.275697 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="de54238c-3c16-4557-aaa0-fb321dc61ca7" containerName="oc" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.275833 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b5caccf-955c-4075-b043-7f1bea611f1e" containerName="smoketest-ceilometer" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.275851 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="registry-server" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.275863 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b5caccf-955c-4075-b043-7f1bea611f1e" containerName="smoketest-collectd" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.275891 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="de54238c-3c16-4557-aaa0-fb321dc61ca7" containerName="oc" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.276694 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jxgzr/must-gather-78m7n" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.281574 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jxgzr"/"openshift-service-ca.crt" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.281796 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jxgzr"/"default-dockercfg-2vzbr" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.282422 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jxgzr"/"kube-root-ca.crt" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.289243 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jxgzr/must-gather-78m7n"] Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.338767 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4c1f251c-2be2-460a-aa78-fca33bed879f-must-gather-output\") pod \"must-gather-78m7n\" (UID: \"4c1f251c-2be2-460a-aa78-fca33bed879f\") " pod="openshift-must-gather-jxgzr/must-gather-78m7n" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.338959 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdhc7\" (UniqueName: \"kubernetes.io/projected/4c1f251c-2be2-460a-aa78-fca33bed879f-kube-api-access-xdhc7\") pod \"must-gather-78m7n\" (UID: \"4c1f251c-2be2-460a-aa78-fca33bed879f\") " pod="openshift-must-gather-jxgzr/must-gather-78m7n" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.440341 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4c1f251c-2be2-460a-aa78-fca33bed879f-must-gather-output\") pod \"must-gather-78m7n\" (UID: \"4c1f251c-2be2-460a-aa78-fca33bed879f\") " 
pod="openshift-must-gather-jxgzr/must-gather-78m7n" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.440747 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdhc7\" (UniqueName: \"kubernetes.io/projected/4c1f251c-2be2-460a-aa78-fca33bed879f-kube-api-access-xdhc7\") pod \"must-gather-78m7n\" (UID: \"4c1f251c-2be2-460a-aa78-fca33bed879f\") " pod="openshift-must-gather-jxgzr/must-gather-78m7n" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.441437 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4c1f251c-2be2-460a-aa78-fca33bed879f-must-gather-output\") pod \"must-gather-78m7n\" (UID: \"4c1f251c-2be2-460a-aa78-fca33bed879f\") " pod="openshift-must-gather-jxgzr/must-gather-78m7n" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.459477 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdhc7\" (UniqueName: \"kubernetes.io/projected/4c1f251c-2be2-460a-aa78-fca33bed879f-kube-api-access-xdhc7\") pod \"must-gather-78m7n\" (UID: \"4c1f251c-2be2-460a-aa78-fca33bed879f\") " pod="openshift-must-gather-jxgzr/must-gather-78m7n" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.594778 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jxgzr/must-gather-78m7n" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.829056 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jxgzr/must-gather-78m7n"] Mar 10 00:36:59 crc kubenswrapper[4994]: I0310 00:36:59.815412 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jxgzr/must-gather-78m7n" event={"ID":"4c1f251c-2be2-460a-aa78-fca33bed879f","Type":"ContainerStarted","Data":"256c5d91a415fa84cf93382c61caac00eddcb4f0bdb42042e78a5b5005818752"} Mar 10 00:37:02 crc kubenswrapper[4994]: I0310 00:37:02.553901 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:37:02 crc kubenswrapper[4994]: E0310 00:37:02.554475 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:37:05 crc kubenswrapper[4994]: I0310 00:37:05.862789 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jxgzr/must-gather-78m7n" event={"ID":"4c1f251c-2be2-460a-aa78-fca33bed879f","Type":"ContainerStarted","Data":"b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7"} Mar 10 00:37:05 crc kubenswrapper[4994]: I0310 00:37:05.863529 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jxgzr/must-gather-78m7n" event={"ID":"4c1f251c-2be2-460a-aa78-fca33bed879f","Type":"ContainerStarted","Data":"54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7"} Mar 10 00:37:05 crc kubenswrapper[4994]: I0310 00:37:05.887311 4994 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-jxgzr/must-gather-78m7n" podStartSLOduration=1.712485998 podStartE2EDuration="7.887278737s" podCreationTimestamp="2026-03-10 00:36:58 +0000 UTC" firstStartedPulling="2026-03-10 00:36:58.836305236 +0000 UTC m=+1833.010011985" lastFinishedPulling="2026-03-10 00:37:05.011097925 +0000 UTC m=+1839.184804724" observedRunningTime="2026-03-10 00:37:05.88160341 +0000 UTC m=+1840.055310159" watchObservedRunningTime="2026-03-10 00:37:05.887278737 +0000 UTC m=+1840.060985526" Mar 10 00:37:13 crc kubenswrapper[4994]: I0310 00:37:13.553566 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:37:13 crc kubenswrapper[4994]: E0310 00:37:13.554633 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:37:27 crc kubenswrapper[4994]: I0310 00:37:27.554670 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:37:27 crc kubenswrapper[4994]: E0310 00:37:27.555819 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:37:33 crc kubenswrapper[4994]: I0310 00:37:33.398199 4994 scope.go:117] "RemoveContainer" 
containerID="efddc60aba5bd5a67702715a9ba5bd81e253ee925326f34bfc3e8b98fe80390e" Mar 10 00:37:33 crc kubenswrapper[4994]: I0310 00:37:33.672762 4994 scope.go:117] "RemoveContainer" containerID="74761a9eff3f39fcf080fcf43da99f605fa6e3e4193c77e19940c640d07270e2" Mar 10 00:37:33 crc kubenswrapper[4994]: I0310 00:37:33.710314 4994 scope.go:117] "RemoveContainer" containerID="d932d5abafc702ac0d919613d1196b6f6540e0380b55b01cbf3b8f80be098bd1" Mar 10 00:37:33 crc kubenswrapper[4994]: I0310 00:37:33.752023 4994 scope.go:117] "RemoveContainer" containerID="7225b87c2068896f2fce33d1aefcc4a4a471fea15131e40f56a1db24cbb94f3e" Mar 10 00:37:40 crc kubenswrapper[4994]: I0310 00:37:40.554770 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:37:40 crc kubenswrapper[4994]: E0310 00:37:40.555557 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:37:55 crc kubenswrapper[4994]: I0310 00:37:55.553964 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:37:55 crc kubenswrapper[4994]: E0310 00:37:55.554867 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:37:57 crc 
kubenswrapper[4994]: I0310 00:37:57.225142 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-vjj5j_f15954a6-2036-4c32-a8b6-bc8e227d0fcd/control-plane-machine-set-operator/0.log" Mar 10 00:37:57 crc kubenswrapper[4994]: I0310 00:37:57.379327 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-m6jnx_fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d/kube-rbac-proxy/0.log" Mar 10 00:37:57 crc kubenswrapper[4994]: I0310 00:37:57.379532 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-m6jnx_fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d/machine-api-operator/0.log" Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.148389 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551718-8lc8n"] Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.150266 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551718-8lc8n" Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.155424 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.155486 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.156320 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.158088 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551718-8lc8n"] Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.307804 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt67d\" (UniqueName: \"kubernetes.io/projected/b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe-kube-api-access-rt67d\") pod \"auto-csr-approver-29551718-8lc8n\" (UID: \"b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe\") " pod="openshift-infra/auto-csr-approver-29551718-8lc8n" Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.409188 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt67d\" (UniqueName: \"kubernetes.io/projected/b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe-kube-api-access-rt67d\") pod \"auto-csr-approver-29551718-8lc8n\" (UID: \"b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe\") " pod="openshift-infra/auto-csr-approver-29551718-8lc8n" Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.437454 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt67d\" (UniqueName: \"kubernetes.io/projected/b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe-kube-api-access-rt67d\") pod \"auto-csr-approver-29551718-8lc8n\" (UID: \"b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe\") " 
pod="openshift-infra/auto-csr-approver-29551718-8lc8n" Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.483108 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551718-8lc8n" Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.751167 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551718-8lc8n"] Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.757590 4994 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 00:38:01 crc kubenswrapper[4994]: I0310 00:38:01.378249 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551718-8lc8n" event={"ID":"b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe","Type":"ContainerStarted","Data":"3b9fa2cf82ce023be37c84782ce38836922cc621c1c4818b6b06a13f5e0d6969"} Mar 10 00:38:02 crc kubenswrapper[4994]: I0310 00:38:02.389730 4994 generic.go:334] "Generic (PLEG): container finished" podID="b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe" containerID="75da607defaa0e2e04f032a83aef4a970ffb1092be73ab93d5a96db312ad3ddb" exitCode=0 Mar 10 00:38:02 crc kubenswrapper[4994]: I0310 00:38:02.389802 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551718-8lc8n" event={"ID":"b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe","Type":"ContainerDied","Data":"75da607defaa0e2e04f032a83aef4a970ffb1092be73ab93d5a96db312ad3ddb"} Mar 10 00:38:03 crc kubenswrapper[4994]: I0310 00:38:03.727938 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551718-8lc8n" Mar 10 00:38:03 crc kubenswrapper[4994]: I0310 00:38:03.863298 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt67d\" (UniqueName: \"kubernetes.io/projected/b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe-kube-api-access-rt67d\") pod \"b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe\" (UID: \"b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe\") " Mar 10 00:38:03 crc kubenswrapper[4994]: I0310 00:38:03.869106 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe-kube-api-access-rt67d" (OuterVolumeSpecName: "kube-api-access-rt67d") pod "b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe" (UID: "b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe"). InnerVolumeSpecName "kube-api-access-rt67d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:38:03 crc kubenswrapper[4994]: I0310 00:38:03.964786 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt67d\" (UniqueName: \"kubernetes.io/projected/b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe-kube-api-access-rt67d\") on node \"crc\" DevicePath \"\"" Mar 10 00:38:04 crc kubenswrapper[4994]: I0310 00:38:04.408681 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551718-8lc8n" event={"ID":"b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe","Type":"ContainerDied","Data":"3b9fa2cf82ce023be37c84782ce38836922cc621c1c4818b6b06a13f5e0d6969"} Mar 10 00:38:04 crc kubenswrapper[4994]: I0310 00:38:04.408722 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b9fa2cf82ce023be37c84782ce38836922cc621c1c4818b6b06a13f5e0d6969" Mar 10 00:38:04 crc kubenswrapper[4994]: I0310 00:38:04.408798 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551718-8lc8n" Mar 10 00:38:04 crc kubenswrapper[4994]: I0310 00:38:04.816819 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551712-nx9pb"] Mar 10 00:38:04 crc kubenswrapper[4994]: I0310 00:38:04.827690 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551712-nx9pb"] Mar 10 00:38:06 crc kubenswrapper[4994]: I0310 00:38:06.564235 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="615394b2-0705-4358-853e-8c52eb448519" path="/var/lib/kubelet/pods/615394b2-0705-4358-853e-8c52eb448519/volumes" Mar 10 00:38:08 crc kubenswrapper[4994]: I0310 00:38:08.555368 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:38:08 crc kubenswrapper[4994]: E0310 00:38:08.557322 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:38:11 crc kubenswrapper[4994]: I0310 00:38:11.488801 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-jjkfq_943085e6-2580-48ae-9c2d-d83989c6204c/cert-manager-controller/0.log" Mar 10 00:38:11 crc kubenswrapper[4994]: I0310 00:38:11.641685 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-8qd55_c5f1e9a7-bff0-4565-9cef-d8904908dbfe/cert-manager-cainjector/0.log" Mar 10 00:38:11 crc kubenswrapper[4994]: I0310 00:38:11.674666 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-6qgfs_ef38e78a-b3a6-4de7-ba46-598693edf905/cert-manager-webhook/0.log" Mar 10 00:38:20 crc kubenswrapper[4994]: I0310 00:38:20.554408 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:38:20 crc kubenswrapper[4994]: E0310 00:38:20.555424 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:38:27 crc kubenswrapper[4994]: I0310 00:38:27.072691 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-fnj29_13e52713-fbfe-43ba-ae51-b13a060d8a05/prometheus-operator/0.log" Mar 10 00:38:27 crc kubenswrapper[4994]: I0310 00:38:27.234956 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw_08b7eb36-ad76-4d9a-9fe9-f37febcdfdab/prometheus-operator-admission-webhook/0.log" Mar 10 00:38:27 crc kubenswrapper[4994]: I0310 00:38:27.250423 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r_22d07ce7-cdcc-4804-8127-a4f3a9d1685f/prometheus-operator-admission-webhook/0.log" Mar 10 00:38:27 crc kubenswrapper[4994]: I0310 00:38:27.398623 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-2jk2w_9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e/operator/0.log" Mar 10 00:38:27 crc kubenswrapper[4994]: I0310 00:38:27.409753 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-gxbhj_65c6820f-4375-4de8-bcdf-0f0e2c4bcd87/perses-operator/0.log" Mar 10 00:38:32 crc kubenswrapper[4994]: I0310 00:38:32.554640 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:38:32 crc kubenswrapper[4994]: E0310 00:38:32.555659 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:38:33 crc kubenswrapper[4994]: I0310 00:38:33.858138 4994 scope.go:117] "RemoveContainer" containerID="ab12f6f7b139f927c15eec55fa9992338c6ae56c8336c6e012df890d87e1461b" Mar 10 00:38:42 crc kubenswrapper[4994]: I0310 00:38:42.731612 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw_c792896d-13dd-4202-a2b7-62aac3396c78/util/0.log" Mar 10 00:38:42 crc kubenswrapper[4994]: I0310 00:38:42.860323 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw_c792896d-13dd-4202-a2b7-62aac3396c78/util/0.log" Mar 10 00:38:42 crc kubenswrapper[4994]: I0310 00:38:42.891982 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw_c792896d-13dd-4202-a2b7-62aac3396c78/pull/0.log" Mar 10 00:38:42 crc kubenswrapper[4994]: I0310 00:38:42.907964 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw_c792896d-13dd-4202-a2b7-62aac3396c78/pull/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.040531 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw_c792896d-13dd-4202-a2b7-62aac3396c78/extract/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.045690 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw_c792896d-13dd-4202-a2b7-62aac3396c78/pull/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.089152 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw_c792896d-13dd-4202-a2b7-62aac3396c78/util/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.217650 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk_b4b3e4dd-b86b-4442-9067-233a79e7942e/util/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.366036 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk_b4b3e4dd-b86b-4442-9067-233a79e7942e/pull/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.371558 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk_b4b3e4dd-b86b-4442-9067-233a79e7942e/pull/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.378033 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk_b4b3e4dd-b86b-4442-9067-233a79e7942e/util/0.log" Mar 10 
00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.574333 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk_b4b3e4dd-b86b-4442-9067-233a79e7942e/util/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.589604 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk_b4b3e4dd-b86b-4442-9067-233a79e7942e/pull/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.601313 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk_b4b3e4dd-b86b-4442-9067-233a79e7942e/extract/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.718647 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5_1b75d3a9-a107-4c28-afc2-7eb7e1357113/util/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.898522 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5_1b75d3a9-a107-4c28-afc2-7eb7e1357113/util/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.929771 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5_1b75d3a9-a107-4c28-afc2-7eb7e1357113/pull/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.962787 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5_1b75d3a9-a107-4c28-afc2-7eb7e1357113/pull/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.110218 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5_1b75d3a9-a107-4c28-afc2-7eb7e1357113/pull/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.110723 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5_1b75d3a9-a107-4c28-afc2-7eb7e1357113/util/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.130524 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5_1b75d3a9-a107-4c28-afc2-7eb7e1357113/extract/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.274645 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz_4dea22bc-f7b5-4722-b2c2-db96edfdcb96/util/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.481247 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz_4dea22bc-f7b5-4722-b2c2-db96edfdcb96/pull/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.484601 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz_4dea22bc-f7b5-4722-b2c2-db96edfdcb96/pull/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.486322 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz_4dea22bc-f7b5-4722-b2c2-db96edfdcb96/util/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.554055 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:38:44 crc kubenswrapper[4994]: E0310 00:38:44.554510 4994 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.656494 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz_4dea22bc-f7b5-4722-b2c2-db96edfdcb96/pull/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.659173 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz_4dea22bc-f7b5-4722-b2c2-db96edfdcb96/util/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.681282 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz_4dea22bc-f7b5-4722-b2c2-db96edfdcb96/extract/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.833313 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4vf56_2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105/extract-utilities/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.023690 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4vf56_2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105/extract-content/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.034147 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4vf56_2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105/extract-content/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.041495 4994 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_certified-operators-4vf56_2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105/extract-utilities/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.205424 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4vf56_2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105/extract-content/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.222567 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4vf56_2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105/extract-utilities/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.406010 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-45nlb_2aaa4876-9545-4d43-b7a3-02d53c8ef8f5/extract-utilities/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.552998 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4vf56_2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105/registry-server/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.589217 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-45nlb_2aaa4876-9545-4d43-b7a3-02d53c8ef8f5/extract-content/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.610538 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-45nlb_2aaa4876-9545-4d43-b7a3-02d53c8ef8f5/extract-utilities/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.644736 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-45nlb_2aaa4876-9545-4d43-b7a3-02d53c8ef8f5/extract-content/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.774983 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-45nlb_2aaa4876-9545-4d43-b7a3-02d53c8ef8f5/extract-utilities/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.781987 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-45nlb_2aaa4876-9545-4d43-b7a3-02d53c8ef8f5/extract-content/0.log" Mar 10 00:38:46 crc kubenswrapper[4994]: I0310 00:38:46.006649 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-ppfwk_46c4619e-ab9f-4fd9-9f3e-5b7ba9415823/marketplace-operator/0.log" Mar 10 00:38:46 crc kubenswrapper[4994]: I0310 00:38:46.058762 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9dnvg_ad4ae94a-f55f-4133-9b34-f95992f5454b/extract-utilities/0.log" Mar 10 00:38:46 crc kubenswrapper[4994]: I0310 00:38:46.076952 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-45nlb_2aaa4876-9545-4d43-b7a3-02d53c8ef8f5/registry-server/0.log" Mar 10 00:38:46 crc kubenswrapper[4994]: I0310 00:38:46.210269 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9dnvg_ad4ae94a-f55f-4133-9b34-f95992f5454b/extract-utilities/0.log" Mar 10 00:38:46 crc kubenswrapper[4994]: I0310 00:38:46.232937 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9dnvg_ad4ae94a-f55f-4133-9b34-f95992f5454b/extract-content/0.log" Mar 10 00:38:46 crc kubenswrapper[4994]: I0310 00:38:46.239982 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9dnvg_ad4ae94a-f55f-4133-9b34-f95992f5454b/extract-content/0.log" Mar 10 00:38:46 crc kubenswrapper[4994]: I0310 00:38:46.364322 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-9dnvg_ad4ae94a-f55f-4133-9b34-f95992f5454b/extract-content/0.log" Mar 10 00:38:46 crc kubenswrapper[4994]: I0310 00:38:46.374547 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9dnvg_ad4ae94a-f55f-4133-9b34-f95992f5454b/extract-utilities/0.log" Mar 10 00:38:46 crc kubenswrapper[4994]: I0310 00:38:46.633337 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9dnvg_ad4ae94a-f55f-4133-9b34-f95992f5454b/registry-server/0.log" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.580759 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jq9wx"] Mar 10 00:38:47 crc kubenswrapper[4994]: E0310 00:38:47.581638 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe" containerName="oc" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.581663 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe" containerName="oc" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.581923 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe" containerName="oc" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.584463 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.589098 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jq9wx"] Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.663121 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-utilities\") pod \"certified-operators-jq9wx\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.663231 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqnlf\" (UniqueName: \"kubernetes.io/projected/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-kube-api-access-kqnlf\") pod \"certified-operators-jq9wx\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.663306 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-catalog-content\") pod \"certified-operators-jq9wx\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.765065 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-utilities\") pod \"certified-operators-jq9wx\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.765120 4994 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kqnlf\" (UniqueName: \"kubernetes.io/projected/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-kube-api-access-kqnlf\") pod \"certified-operators-jq9wx\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.765139 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-catalog-content\") pod \"certified-operators-jq9wx\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.765610 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-utilities\") pod \"certified-operators-jq9wx\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.765635 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-catalog-content\") pod \"certified-operators-jq9wx\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.785448 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqnlf\" (UniqueName: \"kubernetes.io/projected/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-kube-api-access-kqnlf\") pod \"certified-operators-jq9wx\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.934293 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:48 crc kubenswrapper[4994]: I0310 00:38:48.347160 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jq9wx"] Mar 10 00:38:48 crc kubenswrapper[4994]: I0310 00:38:48.744685 4994 generic.go:334] "Generic (PLEG): container finished" podID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerID="a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774" exitCode=0 Mar 10 00:38:48 crc kubenswrapper[4994]: I0310 00:38:48.744772 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq9wx" event={"ID":"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c","Type":"ContainerDied","Data":"a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774"} Mar 10 00:38:48 crc kubenswrapper[4994]: I0310 00:38:48.745008 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq9wx" event={"ID":"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c","Type":"ContainerStarted","Data":"36e610f9b1315284b8cc51fc123f9b0461ca200389e8e569cb119a259c7ea54d"} Mar 10 00:38:49 crc kubenswrapper[4994]: I0310 00:38:49.757439 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq9wx" event={"ID":"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c","Type":"ContainerStarted","Data":"24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62"} Mar 10 00:38:50 crc kubenswrapper[4994]: I0310 00:38:50.772290 4994 generic.go:334] "Generic (PLEG): container finished" podID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerID="24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62" exitCode=0 Mar 10 00:38:50 crc kubenswrapper[4994]: I0310 00:38:50.772330 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq9wx" 
event={"ID":"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c","Type":"ContainerDied","Data":"24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62"} Mar 10 00:38:51 crc kubenswrapper[4994]: I0310 00:38:51.787336 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq9wx" event={"ID":"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c","Type":"ContainerStarted","Data":"5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e"} Mar 10 00:38:51 crc kubenswrapper[4994]: I0310 00:38:51.812139 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jq9wx" podStartSLOduration=2.33666101 podStartE2EDuration="4.812112257s" podCreationTimestamp="2026-03-10 00:38:47 +0000 UTC" firstStartedPulling="2026-03-10 00:38:48.746888172 +0000 UTC m=+1942.920594922" lastFinishedPulling="2026-03-10 00:38:51.22233939 +0000 UTC m=+1945.396046169" observedRunningTime="2026-03-10 00:38:51.810074313 +0000 UTC m=+1945.983781072" watchObservedRunningTime="2026-03-10 00:38:51.812112257 +0000 UTC m=+1945.985819046" Mar 10 00:38:57 crc kubenswrapper[4994]: I0310 00:38:57.554725 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:38:57 crc kubenswrapper[4994]: E0310 00:38:57.555725 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:38:57 crc kubenswrapper[4994]: I0310 00:38:57.935454 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:57 crc 
kubenswrapper[4994]: I0310 00:38:57.935525 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:58 crc kubenswrapper[4994]: I0310 00:38:58.011767 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:58 crc kubenswrapper[4994]: I0310 00:38:58.921380 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:58 crc kubenswrapper[4994]: I0310 00:38:58.983980 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jq9wx"] Mar 10 00:39:00 crc kubenswrapper[4994]: I0310 00:39:00.875515 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jq9wx" podUID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerName="registry-server" containerID="cri-o://5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e" gracePeriod=2 Mar 10 00:39:00 crc kubenswrapper[4994]: I0310 00:39:00.895807 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw_08b7eb36-ad76-4d9a-9fe9-f37febcdfdab/prometheus-operator-admission-webhook/0.log" Mar 10 00:39:00 crc kubenswrapper[4994]: I0310 00:39:00.912711 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-fnj29_13e52713-fbfe-43ba-ae51-b13a060d8a05/prometheus-operator/0.log" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.060935 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-2jk2w_9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e/operator/0.log" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.098545 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-gxbhj_65c6820f-4375-4de8-bcdf-0f0e2c4bcd87/perses-operator/0.log" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.126340 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r_22d07ce7-cdcc-4804-8127-a4f3a9d1685f/prometheus-operator-admission-webhook/0.log" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.244993 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.379256 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-utilities\") pod \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.379303 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-catalog-content\") pod \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.379325 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqnlf\" (UniqueName: \"kubernetes.io/projected/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-kube-api-access-kqnlf\") pod \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.380097 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-utilities" (OuterVolumeSpecName: "utilities") pod "4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" (UID: 
"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.384514 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-kube-api-access-kqnlf" (OuterVolumeSpecName: "kube-api-access-kqnlf") pod "4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" (UID: "4fdd1ac4-0308-464f-82e1-2c9ef11ea84c"). InnerVolumeSpecName "kube-api-access-kqnlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.430623 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" (UID: "4fdd1ac4-0308-464f-82e1-2c9ef11ea84c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.480998 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.481021 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.481030 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqnlf\" (UniqueName: \"kubernetes.io/projected/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-kube-api-access-kqnlf\") on node \"crc\" DevicePath \"\"" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.884296 4994 generic.go:334] "Generic (PLEG): container finished" 
podID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerID="5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e" exitCode=0 Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.884338 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq9wx" event={"ID":"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c","Type":"ContainerDied","Data":"5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e"} Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.884367 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq9wx" event={"ID":"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c","Type":"ContainerDied","Data":"36e610f9b1315284b8cc51fc123f9b0461ca200389e8e569cb119a259c7ea54d"} Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.884389 4994 scope.go:117] "RemoveContainer" containerID="5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.884395 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.912954 4994 scope.go:117] "RemoveContainer" containerID="24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.920466 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jq9wx"] Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.929422 4994 scope.go:117] "RemoveContainer" containerID="a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.949583 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jq9wx"] Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.953784 4994 scope.go:117] "RemoveContainer" containerID="5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e" Mar 10 00:39:01 crc kubenswrapper[4994]: E0310 00:39:01.954231 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e\": container with ID starting with 5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e not found: ID does not exist" containerID="5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.954285 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e"} err="failed to get container status \"5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e\": rpc error: code = NotFound desc = could not find container \"5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e\": container with ID starting with 5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e not 
found: ID does not exist" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.954319 4994 scope.go:117] "RemoveContainer" containerID="24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62" Mar 10 00:39:01 crc kubenswrapper[4994]: E0310 00:39:01.954741 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62\": container with ID starting with 24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62 not found: ID does not exist" containerID="24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.954790 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62"} err="failed to get container status \"24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62\": rpc error: code = NotFound desc = could not find container \"24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62\": container with ID starting with 24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62 not found: ID does not exist" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.954825 4994 scope.go:117] "RemoveContainer" containerID="a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774" Mar 10 00:39:01 crc kubenswrapper[4994]: E0310 00:39:01.955155 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774\": container with ID starting with a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774 not found: ID does not exist" containerID="a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.955185 4994 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774"} err="failed to get container status \"a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774\": rpc error: code = NotFound desc = could not find container \"a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774\": container with ID starting with a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774 not found: ID does not exist"
Mar 10 00:39:02 crc kubenswrapper[4994]: I0310 00:39:02.562816 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" path="/var/lib/kubelet/pods/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c/volumes"
Mar 10 00:39:11 crc kubenswrapper[4994]: I0310 00:39:11.554481 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b"
Mar 10 00:39:11 crc kubenswrapper[4994]: E0310 00:39:11.557638 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace"
Mar 10 00:39:22 crc kubenswrapper[4994]: I0310 00:39:22.554308 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b"
Mar 10 00:39:22 crc kubenswrapper[4994]: E0310 00:39:22.555498 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace"
Mar 10 00:39:35 crc kubenswrapper[4994]: I0310 00:39:35.553748 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b"
Mar 10 00:39:35 crc kubenswrapper[4994]: E0310 00:39:35.554782 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace"
Mar 10 00:39:41 crc kubenswrapper[4994]: I0310 00:39:41.817273 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q92kr"]
Mar 10 00:39:41 crc kubenswrapper[4994]: E0310 00:39:41.818266 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerName="registry-server"
Mar 10 00:39:41 crc kubenswrapper[4994]: I0310 00:39:41.818291 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerName="registry-server"
Mar 10 00:39:41 crc kubenswrapper[4994]: E0310 00:39:41.818332 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerName="extract-content"
Mar 10 00:39:41 crc kubenswrapper[4994]: I0310 00:39:41.818344 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerName="extract-content"
Mar 10 00:39:41 crc kubenswrapper[4994]: E0310 00:39:41.818368 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerName="extract-utilities"
Mar 10 00:39:41 crc kubenswrapper[4994]: I0310 00:39:41.818382 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerName="extract-utilities"
Mar 10 00:39:41 crc kubenswrapper[4994]: I0310 00:39:41.818631 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerName="registry-server"
Mar 10 00:39:41 crc kubenswrapper[4994]: I0310 00:39:41.820659 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q92kr"
Mar 10 00:39:41 crc kubenswrapper[4994]: I0310 00:39:41.839861 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q92kr"]
Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.005337 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-fh2tl"]
Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.006521 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-fh2tl"
Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.015546 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drw5q\" (UniqueName: \"kubernetes.io/projected/ebb35288-8012-435c-acf9-93fa066af8fe-kube-api-access-drw5q\") pod \"community-operators-q92kr\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") " pod="openshift-marketplace/community-operators-q92kr"
Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.015608 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-utilities\") pod \"community-operators-q92kr\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") " pod="openshift-marketplace/community-operators-q92kr"
Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.015651 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-catalog-content\") pod \"community-operators-q92kr\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") " pod="openshift-marketplace/community-operators-q92kr"
Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.031103 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-fh2tl"]
Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.116712 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-utilities\") pod \"community-operators-q92kr\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") " pod="openshift-marketplace/community-operators-q92kr"
Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.116804 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-catalog-content\") pod \"community-operators-q92kr\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") " pod="openshift-marketplace/community-operators-q92kr"
Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.116936 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m48fb\" (UniqueName: \"kubernetes.io/projected/2808be1e-48f1-4d08-98aa-c58ef6d4c153-kube-api-access-m48fb\") pod \"infrawatch-operators-fh2tl\" (UID: \"2808be1e-48f1-4d08-98aa-c58ef6d4c153\") " pod="service-telemetry/infrawatch-operators-fh2tl"
Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.117023 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drw5q\" (UniqueName: \"kubernetes.io/projected/ebb35288-8012-435c-acf9-93fa066af8fe-kube-api-access-drw5q\") pod \"community-operators-q92kr\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") " pod="openshift-marketplace/community-operators-q92kr"
Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.117646 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-utilities\") pod \"community-operators-q92kr\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") " pod="openshift-marketplace/community-operators-q92kr"
Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.117681 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-catalog-content\") pod \"community-operators-q92kr\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") " pod="openshift-marketplace/community-operators-q92kr"
Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.139444 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drw5q\" (UniqueName: \"kubernetes.io/projected/ebb35288-8012-435c-acf9-93fa066af8fe-kube-api-access-drw5q\") pod \"community-operators-q92kr\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") " pod="openshift-marketplace/community-operators-q92kr"
Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.188335 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q92kr"
Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.218271 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m48fb\" (UniqueName: \"kubernetes.io/projected/2808be1e-48f1-4d08-98aa-c58ef6d4c153-kube-api-access-m48fb\") pod \"infrawatch-operators-fh2tl\" (UID: \"2808be1e-48f1-4d08-98aa-c58ef6d4c153\") " pod="service-telemetry/infrawatch-operators-fh2tl"
Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.239446 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m48fb\" (UniqueName: \"kubernetes.io/projected/2808be1e-48f1-4d08-98aa-c58ef6d4c153-kube-api-access-m48fb\") pod \"infrawatch-operators-fh2tl\" (UID: \"2808be1e-48f1-4d08-98aa-c58ef6d4c153\") " pod="service-telemetry/infrawatch-operators-fh2tl"
Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.334073 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-fh2tl"
Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.491401 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q92kr"]
Mar 10 00:39:42 crc kubenswrapper[4994]: W0310 00:39:42.799108 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2808be1e_48f1_4d08_98aa_c58ef6d4c153.slice/crio-fbf0824f915ac92385acafbf7095682500c296e3f241107075559fc7e1785852 WatchSource:0}: Error finding container fbf0824f915ac92385acafbf7095682500c296e3f241107075559fc7e1785852: Status 404 returned error can't find the container with id fbf0824f915ac92385acafbf7095682500c296e3f241107075559fc7e1785852
Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.799735 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-fh2tl"]
Mar 10 00:39:43 crc kubenswrapper[4994]: I0310 00:39:43.282136 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-fh2tl" event={"ID":"2808be1e-48f1-4d08-98aa-c58ef6d4c153","Type":"ContainerStarted","Data":"b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02"}
Mar 10 00:39:43 crc kubenswrapper[4994]: I0310 00:39:43.282226 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-fh2tl" event={"ID":"2808be1e-48f1-4d08-98aa-c58ef6d4c153","Type":"ContainerStarted","Data":"fbf0824f915ac92385acafbf7095682500c296e3f241107075559fc7e1785852"}
Mar 10 00:39:43 crc kubenswrapper[4994]: I0310 00:39:43.285650 4994 generic.go:334] "Generic (PLEG): container finished" podID="ebb35288-8012-435c-acf9-93fa066af8fe" containerID="7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb" exitCode=0
Mar 10 00:39:43 crc kubenswrapper[4994]: I0310 00:39:43.285699 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q92kr" event={"ID":"ebb35288-8012-435c-acf9-93fa066af8fe","Type":"ContainerDied","Data":"7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb"}
Mar 10 00:39:43 crc kubenswrapper[4994]: I0310 00:39:43.285735 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q92kr" event={"ID":"ebb35288-8012-435c-acf9-93fa066af8fe","Type":"ContainerStarted","Data":"f7f1fc60078657da38db0b539efc33423a99b403512bcdfb6d438710e442e259"}
Mar 10 00:39:43 crc kubenswrapper[4994]: I0310 00:39:43.310815 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-fh2tl" podStartSLOduration=2.180513493 podStartE2EDuration="2.31079108s" podCreationTimestamp="2026-03-10 00:39:41 +0000 UTC" firstStartedPulling="2026-03-10 00:39:42.80402288 +0000 UTC m=+1996.977729669" lastFinishedPulling="2026-03-10 00:39:42.934300477 +0000 UTC m=+1997.108007256" observedRunningTime="2026-03-10 00:39:43.3050637 +0000 UTC m=+1997.478770519" watchObservedRunningTime="2026-03-10 00:39:43.31079108 +0000 UTC m=+1997.484497859"
Mar 10 00:39:45 crc kubenswrapper[4994]: I0310 00:39:45.319142 4994 generic.go:334] "Generic (PLEG): container finished" podID="ebb35288-8012-435c-acf9-93fa066af8fe" containerID="d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119" exitCode=0
Mar 10 00:39:45 crc kubenswrapper[4994]: I0310 00:39:45.319218 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q92kr" event={"ID":"ebb35288-8012-435c-acf9-93fa066af8fe","Type":"ContainerDied","Data":"d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119"}
Mar 10 00:39:46 crc kubenswrapper[4994]: I0310 00:39:46.333819 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q92kr" event={"ID":"ebb35288-8012-435c-acf9-93fa066af8fe","Type":"ContainerStarted","Data":"563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46"}
Mar 10 00:39:47 crc kubenswrapper[4994]: I0310 00:39:47.554866 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b"
Mar 10 00:39:47 crc kubenswrapper[4994]: E0310 00:39:47.558021 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace"
Mar 10 00:39:49 crc kubenswrapper[4994]: I0310 00:39:49.359657 4994 generic.go:334] "Generic (PLEG): container finished" podID="4c1f251c-2be2-460a-aa78-fca33bed879f" containerID="54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7" exitCode=0
Mar 10 00:39:49 crc kubenswrapper[4994]: I0310 00:39:49.359743 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jxgzr/must-gather-78m7n" event={"ID":"4c1f251c-2be2-460a-aa78-fca33bed879f","Type":"ContainerDied","Data":"54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7"}
Mar 10 00:39:49 crc kubenswrapper[4994]: I0310 00:39:49.360481 4994 scope.go:117] "RemoveContainer" containerID="54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7"
Mar 10 00:39:49 crc kubenswrapper[4994]: I0310 00:39:49.388584 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q92kr" podStartSLOduration=5.891254908 podStartE2EDuration="8.388558638s" podCreationTimestamp="2026-03-10 00:39:41 +0000 UTC" firstStartedPulling="2026-03-10 00:39:43.299527545 +0000 UTC m=+1997.473234324" lastFinishedPulling="2026-03-10 00:39:45.796831265 +0000 UTC m=+1999.970538054" observedRunningTime="2026-03-10 00:39:46.377028942 +0000 UTC m=+2000.550735701" watchObservedRunningTime="2026-03-10 00:39:49.388558638 +0000 UTC m=+2003.562265427"
Mar 10 00:39:50 crc kubenswrapper[4994]: I0310 00:39:50.294740 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jxgzr_must-gather-78m7n_4c1f251c-2be2-460a-aa78-fca33bed879f/gather/0.log"
Mar 10 00:39:52 crc kubenswrapper[4994]: I0310 00:39:52.188699 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q92kr"
Mar 10 00:39:52 crc kubenswrapper[4994]: I0310 00:39:52.189100 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q92kr"
Mar 10 00:39:52 crc kubenswrapper[4994]: I0310 00:39:52.260645 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q92kr"
Mar 10 00:39:52 crc kubenswrapper[4994]: I0310 00:39:52.335489 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-fh2tl"
Mar 10 00:39:52 crc kubenswrapper[4994]: I0310 00:39:52.335569 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-fh2tl"
Mar 10 00:39:52 crc kubenswrapper[4994]: I0310 00:39:52.384163 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-fh2tl"
Mar 10 00:39:52 crc kubenswrapper[4994]: I0310 00:39:52.439723 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-fh2tl"
Mar 10 00:39:52 crc kubenswrapper[4994]: I0310 00:39:52.470420 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q92kr"
Mar 10 00:39:55 crc kubenswrapper[4994]: I0310 00:39:55.599900 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q92kr"]
Mar 10 00:39:55 crc kubenswrapper[4994]: I0310 00:39:55.600530 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q92kr" podUID="ebb35288-8012-435c-acf9-93fa066af8fe" containerName="registry-server" containerID="cri-o://563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46" gracePeriod=2
Mar 10 00:39:55 crc kubenswrapper[4994]: I0310 00:39:55.803660 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-fh2tl"]
Mar 10 00:39:55 crc kubenswrapper[4994]: I0310 00:39:55.804305 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-fh2tl" podUID="2808be1e-48f1-4d08-98aa-c58ef6d4c153" containerName="registry-server" containerID="cri-o://b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02" gracePeriod=2
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.095084 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q92kr"
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.231317 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-fh2tl"
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.282949 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-utilities\") pod \"ebb35288-8012-435c-acf9-93fa066af8fe\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") "
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.283109 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m48fb\" (UniqueName: \"kubernetes.io/projected/2808be1e-48f1-4d08-98aa-c58ef6d4c153-kube-api-access-m48fb\") pod \"2808be1e-48f1-4d08-98aa-c58ef6d4c153\" (UID: \"2808be1e-48f1-4d08-98aa-c58ef6d4c153\") "
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.283156 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-catalog-content\") pod \"ebb35288-8012-435c-acf9-93fa066af8fe\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") "
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.283188 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drw5q\" (UniqueName: \"kubernetes.io/projected/ebb35288-8012-435c-acf9-93fa066af8fe-kube-api-access-drw5q\") pod \"ebb35288-8012-435c-acf9-93fa066af8fe\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") "
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.284580 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-utilities" (OuterVolumeSpecName: "utilities") pod "ebb35288-8012-435c-acf9-93fa066af8fe" (UID: "ebb35288-8012-435c-acf9-93fa066af8fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.289167 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebb35288-8012-435c-acf9-93fa066af8fe-kube-api-access-drw5q" (OuterVolumeSpecName: "kube-api-access-drw5q") pod "ebb35288-8012-435c-acf9-93fa066af8fe" (UID: "ebb35288-8012-435c-acf9-93fa066af8fe"). InnerVolumeSpecName "kube-api-access-drw5q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.290178 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2808be1e-48f1-4d08-98aa-c58ef6d4c153-kube-api-access-m48fb" (OuterVolumeSpecName: "kube-api-access-m48fb") pod "2808be1e-48f1-4d08-98aa-c58ef6d4c153" (UID: "2808be1e-48f1-4d08-98aa-c58ef6d4c153"). InnerVolumeSpecName "kube-api-access-m48fb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.333576 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebb35288-8012-435c-acf9-93fa066af8fe" (UID: "ebb35288-8012-435c-acf9-93fa066af8fe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.385296 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.385329 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m48fb\" (UniqueName: \"kubernetes.io/projected/2808be1e-48f1-4d08-98aa-c58ef6d4c153-kube-api-access-m48fb\") on node \"crc\" DevicePath \"\""
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.385339 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.385347 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drw5q\" (UniqueName: \"kubernetes.io/projected/ebb35288-8012-435c-acf9-93fa066af8fe-kube-api-access-drw5q\") on node \"crc\" DevicePath \"\""
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.446376 4994 generic.go:334] "Generic (PLEG): container finished" podID="2808be1e-48f1-4d08-98aa-c58ef6d4c153" containerID="b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02" exitCode=0
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.446584 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-fh2tl"
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.446618 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-fh2tl" event={"ID":"2808be1e-48f1-4d08-98aa-c58ef6d4c153","Type":"ContainerDied","Data":"b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02"}
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.446670 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-fh2tl" event={"ID":"2808be1e-48f1-4d08-98aa-c58ef6d4c153","Type":"ContainerDied","Data":"fbf0824f915ac92385acafbf7095682500c296e3f241107075559fc7e1785852"}
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.446710 4994 scope.go:117] "RemoveContainer" containerID="b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02"
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.450678 4994 generic.go:334] "Generic (PLEG): container finished" podID="ebb35288-8012-435c-acf9-93fa066af8fe" containerID="563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46" exitCode=0
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.450768 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q92kr" event={"ID":"ebb35288-8012-435c-acf9-93fa066af8fe","Type":"ContainerDied","Data":"563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46"}
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.450822 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q92kr" event={"ID":"ebb35288-8012-435c-acf9-93fa066af8fe","Type":"ContainerDied","Data":"f7f1fc60078657da38db0b539efc33423a99b403512bcdfb6d438710e442e259"}
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.451057 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q92kr"
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.472620 4994 scope.go:117] "RemoveContainer" containerID="b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02"
Mar 10 00:39:56 crc kubenswrapper[4994]: E0310 00:39:56.473108 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02\": container with ID starting with b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02 not found: ID does not exist" containerID="b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02"
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.473160 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02"} err="failed to get container status \"b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02\": rpc error: code = NotFound desc = could not find container \"b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02\": container with ID starting with b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02 not found: ID does not exist"
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.473192 4994 scope.go:117] "RemoveContainer" containerID="563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46"
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.497448 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-fh2tl"]
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.507830 4994 scope.go:117] "RemoveContainer" containerID="d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119"
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.510187 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-fh2tl"]
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.525472 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q92kr"]
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.532933 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q92kr"]
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.540438 4994 scope.go:117] "RemoveContainer" containerID="7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb"
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.560660 4994 scope.go:117] "RemoveContainer" containerID="563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46"
Mar 10 00:39:56 crc kubenswrapper[4994]: E0310 00:39:56.571171 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46\": container with ID starting with 563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46 not found: ID does not exist" containerID="563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46"
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.571440 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46"} err="failed to get container status \"563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46\": rpc error: code = NotFound desc = could not find container \"563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46\": container with ID starting with 563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46 not found: ID does not exist"
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.571596 4994 scope.go:117] "RemoveContainer" containerID="d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119"
Mar 10 00:39:56 crc kubenswrapper[4994]: E0310 00:39:56.572295 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119\": container with ID starting with d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119 not found: ID does not exist" containerID="d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119"
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.572359 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119"} err="failed to get container status \"d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119\": rpc error: code = NotFound desc = could not find container \"d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119\": container with ID starting with d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119 not found: ID does not exist"
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.572401 4994 scope.go:117] "RemoveContainer" containerID="7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb"
Mar 10 00:39:56 crc kubenswrapper[4994]: E0310 00:39:56.573348 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb\": container with ID starting with 7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb not found: ID does not exist" containerID="7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb"
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.573558 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb"} err="failed to get container status \"7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb\": rpc error: code = NotFound desc = could not find container \"7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb\": container with ID starting with 7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb not found: ID does not exist"
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.575072 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2808be1e-48f1-4d08-98aa-c58ef6d4c153" path="/var/lib/kubelet/pods/2808be1e-48f1-4d08-98aa-c58ef6d4c153/volumes"
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.575692 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebb35288-8012-435c-acf9-93fa066af8fe" path="/var/lib/kubelet/pods/ebb35288-8012-435c-acf9-93fa066af8fe/volumes"
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.677558 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jxgzr/must-gather-78m7n"]
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.677987 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-jxgzr/must-gather-78m7n" podUID="4c1f251c-2be2-460a-aa78-fca33bed879f" containerName="copy" containerID="cri-o://b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7" gracePeriod=2
Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.689773 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jxgzr/must-gather-78m7n"]
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.059163 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jxgzr_must-gather-78m7n_4c1f251c-2be2-460a-aa78-fca33bed879f/copy/0.log"
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.060101 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jxgzr/must-gather-78m7n"
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.197850 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdhc7\" (UniqueName: \"kubernetes.io/projected/4c1f251c-2be2-460a-aa78-fca33bed879f-kube-api-access-xdhc7\") pod \"4c1f251c-2be2-460a-aa78-fca33bed879f\" (UID: \"4c1f251c-2be2-460a-aa78-fca33bed879f\") "
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.198023 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4c1f251c-2be2-460a-aa78-fca33bed879f-must-gather-output\") pod \"4c1f251c-2be2-460a-aa78-fca33bed879f\" (UID: \"4c1f251c-2be2-460a-aa78-fca33bed879f\") "
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.203595 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c1f251c-2be2-460a-aa78-fca33bed879f-kube-api-access-xdhc7" (OuterVolumeSpecName: "kube-api-access-xdhc7") pod "4c1f251c-2be2-460a-aa78-fca33bed879f" (UID: "4c1f251c-2be2-460a-aa78-fca33bed879f"). InnerVolumeSpecName "kube-api-access-xdhc7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.258076 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c1f251c-2be2-460a-aa78-fca33bed879f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "4c1f251c-2be2-460a-aa78-fca33bed879f" (UID: "4c1f251c-2be2-460a-aa78-fca33bed879f"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.300849 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdhc7\" (UniqueName: \"kubernetes.io/projected/4c1f251c-2be2-460a-aa78-fca33bed879f-kube-api-access-xdhc7\") on node \"crc\" DevicePath \"\""
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.300926 4994 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4c1f251c-2be2-460a-aa78-fca33bed879f-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.460610 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jxgzr_must-gather-78m7n_4c1f251c-2be2-460a-aa78-fca33bed879f/copy/0.log"
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.460998 4994 generic.go:334] "Generic (PLEG): container finished" podID="4c1f251c-2be2-460a-aa78-fca33bed879f" containerID="b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7" exitCode=143
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.461047 4994 scope.go:117] "RemoveContainer" containerID="b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7"
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.461119 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jxgzr/must-gather-78m7n"
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.519023 4994 scope.go:117] "RemoveContainer" containerID="54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7"
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.578156 4994 scope.go:117] "RemoveContainer" containerID="b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7"
Mar 10 00:39:57 crc kubenswrapper[4994]: E0310 00:39:57.578862 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7\": container with ID starting with b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7 not found: ID does not exist" containerID="b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7"
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.578923 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7"} err="failed to get container status \"b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7\": rpc error: code = NotFound desc = could not find container \"b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7\": container with ID starting with b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7 not found: ID does not exist"
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.578949 4994 scope.go:117] "RemoveContainer" containerID="54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7"
Mar 10 00:39:57 crc kubenswrapper[4994]: E0310 00:39:57.579237 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7\": container with ID starting with 54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7 not found: ID does not exist" containerID="54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7"
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.579278 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7"} err="failed to get container status \"54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7\": rpc error: code = NotFound desc = could not find container \"54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7\": container with ID starting with 54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7 not found: ID does not exist"
Mar 10 00:39:58 crc kubenswrapper[4994]: I0310 00:39:58.576025 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c1f251c-2be2-460a-aa78-fca33bed879f" path="/var/lib/kubelet/pods/4c1f251c-2be2-460a-aa78-fca33bed879f/volumes"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.138148 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551720-fld4g"]
Mar 10 00:40:00 crc kubenswrapper[4994]: E0310 00:40:00.138957 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c1f251c-2be2-460a-aa78-fca33bed879f" containerName="gather"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.138972 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c1f251c-2be2-460a-aa78-fca33bed879f" containerName="gather"
Mar 10 00:40:00 crc kubenswrapper[4994]: E0310 00:40:00.138980 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebb35288-8012-435c-acf9-93fa066af8fe" containerName="extract-utilities"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.138987 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb35288-8012-435c-acf9-93fa066af8fe" containerName="extract-utilities"
Mar 10 00:40:00 crc 
kubenswrapper[4994]: E0310 00:40:00.139002 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebb35288-8012-435c-acf9-93fa066af8fe" containerName="extract-content" Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.139009 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb35288-8012-435c-acf9-93fa066af8fe" containerName="extract-content" Mar 10 00:40:00 crc kubenswrapper[4994]: E0310 00:40:00.139022 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebb35288-8012-435c-acf9-93fa066af8fe" containerName="registry-server" Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.139027 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb35288-8012-435c-acf9-93fa066af8fe" containerName="registry-server" Mar 10 00:40:00 crc kubenswrapper[4994]: E0310 00:40:00.139036 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c1f251c-2be2-460a-aa78-fca33bed879f" containerName="copy" Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.139043 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c1f251c-2be2-460a-aa78-fca33bed879f" containerName="copy" Mar 10 00:40:00 crc kubenswrapper[4994]: E0310 00:40:00.139058 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2808be1e-48f1-4d08-98aa-c58ef6d4c153" containerName="registry-server" Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.139064 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="2808be1e-48f1-4d08-98aa-c58ef6d4c153" containerName="registry-server" Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.139166 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c1f251c-2be2-460a-aa78-fca33bed879f" containerName="gather" Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.139178 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="2808be1e-48f1-4d08-98aa-c58ef6d4c153" containerName="registry-server" Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 
00:40:00.139193 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c1f251c-2be2-460a-aa78-fca33bed879f" containerName="copy" Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.139202 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebb35288-8012-435c-acf9-93fa066af8fe" containerName="registry-server" Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.139609 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551720-fld4g" Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.145792 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.146114 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.146132 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.153089 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551720-fld4g"] Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.245750 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpctq\" (UniqueName: \"kubernetes.io/projected/e4e99279-1a96-4b0e-b307-3d7badc31d87-kube-api-access-lpctq\") pod \"auto-csr-approver-29551720-fld4g\" (UID: \"e4e99279-1a96-4b0e-b307-3d7badc31d87\") " pod="openshift-infra/auto-csr-approver-29551720-fld4g" Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.347312 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpctq\" (UniqueName: \"kubernetes.io/projected/e4e99279-1a96-4b0e-b307-3d7badc31d87-kube-api-access-lpctq\") pod \"auto-csr-approver-29551720-fld4g\" (UID: 
\"e4e99279-1a96-4b0e-b307-3d7badc31d87\") " pod="openshift-infra/auto-csr-approver-29551720-fld4g" Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.370605 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpctq\" (UniqueName: \"kubernetes.io/projected/e4e99279-1a96-4b0e-b307-3d7badc31d87-kube-api-access-lpctq\") pod \"auto-csr-approver-29551720-fld4g\" (UID: \"e4e99279-1a96-4b0e-b307-3d7badc31d87\") " pod="openshift-infra/auto-csr-approver-29551720-fld4g" Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.501822 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551720-fld4g" Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.555434 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:40:00 crc kubenswrapper[4994]: E0310 00:40:00.555699 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.821650 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551720-fld4g"] Mar 10 00:40:00 crc kubenswrapper[4994]: W0310 00:40:00.822477 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4e99279_1a96_4b0e_b307_3d7badc31d87.slice/crio-c90e9a986f2f2e53cd8946728ec99ceea9179d1f394dae89d448677f285894a4 WatchSource:0}: Error finding container c90e9a986f2f2e53cd8946728ec99ceea9179d1f394dae89d448677f285894a4: Status 404 returned error can't 
find the container with id c90e9a986f2f2e53cd8946728ec99ceea9179d1f394dae89d448677f285894a4 Mar 10 00:40:01 crc kubenswrapper[4994]: I0310 00:40:01.527354 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551720-fld4g" event={"ID":"e4e99279-1a96-4b0e-b307-3d7badc31d87","Type":"ContainerStarted","Data":"c90e9a986f2f2e53cd8946728ec99ceea9179d1f394dae89d448677f285894a4"} Mar 10 00:40:02 crc kubenswrapper[4994]: I0310 00:40:02.534683 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551720-fld4g" event={"ID":"e4e99279-1a96-4b0e-b307-3d7badc31d87","Type":"ContainerStarted","Data":"05c0570be18d37a454ce767ab9e7e5834f1cf6173510437b4d6ef60917692ea8"} Mar 10 00:40:02 crc kubenswrapper[4994]: I0310 00:40:02.568693 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551720-fld4g" podStartSLOduration=1.378823154 podStartE2EDuration="2.568677335s" podCreationTimestamp="2026-03-10 00:40:00 +0000 UTC" firstStartedPulling="2026-03-10 00:40:00.825006759 +0000 UTC m=+2014.998713518" lastFinishedPulling="2026-03-10 00:40:02.01486091 +0000 UTC m=+2016.188567699" observedRunningTime="2026-03-10 00:40:02.56545439 +0000 UTC m=+2016.739161139" watchObservedRunningTime="2026-03-10 00:40:02.568677335 +0000 UTC m=+2016.742384084" Mar 10 00:40:03 crc kubenswrapper[4994]: I0310 00:40:03.552583 4994 generic.go:334] "Generic (PLEG): container finished" podID="e4e99279-1a96-4b0e-b307-3d7badc31d87" containerID="05c0570be18d37a454ce767ab9e7e5834f1cf6173510437b4d6ef60917692ea8" exitCode=0 Mar 10 00:40:03 crc kubenswrapper[4994]: I0310 00:40:03.552734 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551720-fld4g" event={"ID":"e4e99279-1a96-4b0e-b307-3d7badc31d87","Type":"ContainerDied","Data":"05c0570be18d37a454ce767ab9e7e5834f1cf6173510437b4d6ef60917692ea8"} Mar 10 00:40:04 crc kubenswrapper[4994]: 
I0310 00:40:04.965828 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551720-fld4g" Mar 10 00:40:05 crc kubenswrapper[4994]: I0310 00:40:05.135442 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpctq\" (UniqueName: \"kubernetes.io/projected/e4e99279-1a96-4b0e-b307-3d7badc31d87-kube-api-access-lpctq\") pod \"e4e99279-1a96-4b0e-b307-3d7badc31d87\" (UID: \"e4e99279-1a96-4b0e-b307-3d7badc31d87\") " Mar 10 00:40:05 crc kubenswrapper[4994]: I0310 00:40:05.143672 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4e99279-1a96-4b0e-b307-3d7badc31d87-kube-api-access-lpctq" (OuterVolumeSpecName: "kube-api-access-lpctq") pod "e4e99279-1a96-4b0e-b307-3d7badc31d87" (UID: "e4e99279-1a96-4b0e-b307-3d7badc31d87"). InnerVolumeSpecName "kube-api-access-lpctq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:40:05 crc kubenswrapper[4994]: I0310 00:40:05.237608 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpctq\" (UniqueName: \"kubernetes.io/projected/e4e99279-1a96-4b0e-b307-3d7badc31d87-kube-api-access-lpctq\") on node \"crc\" DevicePath \"\"" Mar 10 00:40:05 crc kubenswrapper[4994]: I0310 00:40:05.577500 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551720-fld4g" event={"ID":"e4e99279-1a96-4b0e-b307-3d7badc31d87","Type":"ContainerDied","Data":"c90e9a986f2f2e53cd8946728ec99ceea9179d1f394dae89d448677f285894a4"} Mar 10 00:40:05 crc kubenswrapper[4994]: I0310 00:40:05.577557 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551720-fld4g" Mar 10 00:40:05 crc kubenswrapper[4994]: I0310 00:40:05.577565 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c90e9a986f2f2e53cd8946728ec99ceea9179d1f394dae89d448677f285894a4" Mar 10 00:40:05 crc kubenswrapper[4994]: I0310 00:40:05.631092 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551714-s79ft"] Mar 10 00:40:05 crc kubenswrapper[4994]: I0310 00:40:05.642414 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551714-s79ft"] Mar 10 00:40:06 crc kubenswrapper[4994]: I0310 00:40:06.568540 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c" path="/var/lib/kubelet/pods/3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c/volumes" Mar 10 00:40:14 crc kubenswrapper[4994]: I0310 00:40:14.554757 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:40:14 crc kubenswrapper[4994]: E0310 00:40:14.555726 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:40:29 crc kubenswrapper[4994]: I0310 00:40:29.554376 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:40:29 crc kubenswrapper[4994]: E0310 00:40:29.555575 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:40:33 crc kubenswrapper[4994]: I0310 00:40:33.984813 4994 scope.go:117] "RemoveContainer" containerID="55ae1bc05b680756a0fab6fc454424e48677cf98e3af7624cd80e10e8ec94e10" Mar 10 00:40:43 crc kubenswrapper[4994]: I0310 00:40:43.562995 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:40:43 crc kubenswrapper[4994]: E0310 00:40:43.564103 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:40:58 crc kubenswrapper[4994]: I0310 00:40:58.554799 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:40:59 crc kubenswrapper[4994]: I0310 00:40:59.095538 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"6c1ad9a8c7ff342b60b2a6e64cc726375287091375c2b73911e495c7acc74748"} Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.156959 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551722-6kglk"] Mar 10 00:42:00 crc kubenswrapper[4994]: E0310 00:42:00.158372 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4e99279-1a96-4b0e-b307-3d7badc31d87" containerName="oc" Mar 10 00:42:00 
crc kubenswrapper[4994]: I0310 00:42:00.158397 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e99279-1a96-4b0e-b307-3d7badc31d87" containerName="oc" Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.158657 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4e99279-1a96-4b0e-b307-3d7badc31d87" containerName="oc" Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.159388 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551722-6kglk" Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.165855 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.166409 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.167818 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.168777 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551722-6kglk"] Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.249018 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv8vc\" (UniqueName: \"kubernetes.io/projected/adcf87b3-4e04-4030-84eb-f132b3d94687-kube-api-access-sv8vc\") pod \"auto-csr-approver-29551722-6kglk\" (UID: \"adcf87b3-4e04-4030-84eb-f132b3d94687\") " pod="openshift-infra/auto-csr-approver-29551722-6kglk" Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.351376 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv8vc\" (UniqueName: \"kubernetes.io/projected/adcf87b3-4e04-4030-84eb-f132b3d94687-kube-api-access-sv8vc\") pod \"auto-csr-approver-29551722-6kglk\" 
(UID: \"adcf87b3-4e04-4030-84eb-f132b3d94687\") " pod="openshift-infra/auto-csr-approver-29551722-6kglk" Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.386292 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv8vc\" (UniqueName: \"kubernetes.io/projected/adcf87b3-4e04-4030-84eb-f132b3d94687-kube-api-access-sv8vc\") pod \"auto-csr-approver-29551722-6kglk\" (UID: \"adcf87b3-4e04-4030-84eb-f132b3d94687\") " pod="openshift-infra/auto-csr-approver-29551722-6kglk" Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.495644 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551722-6kglk" Mar 10 00:42:01 crc kubenswrapper[4994]: I0310 00:42:01.066369 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551722-6kglk"] Mar 10 00:42:01 crc kubenswrapper[4994]: I0310 00:42:01.813531 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551722-6kglk" event={"ID":"adcf87b3-4e04-4030-84eb-f132b3d94687","Type":"ContainerStarted","Data":"7e436ade524ad6984f2d2252a39f09227dcc0b17383ba118b30f11a4a0b1373a"} Mar 10 00:42:02 crc kubenswrapper[4994]: I0310 00:42:02.828919 4994 generic.go:334] "Generic (PLEG): container finished" podID="adcf87b3-4e04-4030-84eb-f132b3d94687" containerID="36044df81f0ee5cf4a106de302e414c77b5e88caeb4ad173fa482133a0b5fa04" exitCode=0 Mar 10 00:42:02 crc kubenswrapper[4994]: I0310 00:42:02.829115 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551722-6kglk" event={"ID":"adcf87b3-4e04-4030-84eb-f132b3d94687","Type":"ContainerDied","Data":"36044df81f0ee5cf4a106de302e414c77b5e88caeb4ad173fa482133a0b5fa04"} Mar 10 00:42:04 crc kubenswrapper[4994]: I0310 00:42:04.228335 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551722-6kglk" Mar 10 00:42:04 crc kubenswrapper[4994]: I0310 00:42:04.420689 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv8vc\" (UniqueName: \"kubernetes.io/projected/adcf87b3-4e04-4030-84eb-f132b3d94687-kube-api-access-sv8vc\") pod \"adcf87b3-4e04-4030-84eb-f132b3d94687\" (UID: \"adcf87b3-4e04-4030-84eb-f132b3d94687\") " Mar 10 00:42:04 crc kubenswrapper[4994]: I0310 00:42:04.431352 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adcf87b3-4e04-4030-84eb-f132b3d94687-kube-api-access-sv8vc" (OuterVolumeSpecName: "kube-api-access-sv8vc") pod "adcf87b3-4e04-4030-84eb-f132b3d94687" (UID: "adcf87b3-4e04-4030-84eb-f132b3d94687"). InnerVolumeSpecName "kube-api-access-sv8vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:42:04 crc kubenswrapper[4994]: I0310 00:42:04.522800 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv8vc\" (UniqueName: \"kubernetes.io/projected/adcf87b3-4e04-4030-84eb-f132b3d94687-kube-api-access-sv8vc\") on node \"crc\" DevicePath \"\"" Mar 10 00:42:04 crc kubenswrapper[4994]: I0310 00:42:04.854597 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551722-6kglk" event={"ID":"adcf87b3-4e04-4030-84eb-f132b3d94687","Type":"ContainerDied","Data":"7e436ade524ad6984f2d2252a39f09227dcc0b17383ba118b30f11a4a0b1373a"} Mar 10 00:42:04 crc kubenswrapper[4994]: I0310 00:42:04.854658 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e436ade524ad6984f2d2252a39f09227dcc0b17383ba118b30f11a4a0b1373a" Mar 10 00:42:04 crc kubenswrapper[4994]: I0310 00:42:04.854678 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551722-6kglk" Mar 10 00:42:05 crc kubenswrapper[4994]: I0310 00:42:05.315773 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551716-7pfgn"] Mar 10 00:42:05 crc kubenswrapper[4994]: I0310 00:42:05.326014 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551716-7pfgn"] Mar 10 00:42:06 crc kubenswrapper[4994]: I0310 00:42:06.569698 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de54238c-3c16-4557-aaa0-fb321dc61ca7" path="/var/lib/kubelet/pods/de54238c-3c16-4557-aaa0-fb321dc61ca7/volumes" var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515153664370024457 0ustar coreroot  Om77'(var/home/core/zuul-output/logs/crc-cloud/0000755000175000000000000000000015153664371017375 5ustar corerootvar/home/core/zuul-output/artifacts/0000755000175000017500000000000015153657605016522 5ustar corecorevar/home/core/zuul-output/docs/0000755000175000017500000000000015153657606015473 5ustar corecore